diff --git a/README.md b/README.md
new file mode 100644
index 0000000..61865f6
--- /dev/null
+++ b/README.md
@@ -0,0 +1,107 @@
+# ViLBERT
+
+Code and pre-trained models for **ViLBERT: Pretraining Task-Agnostic Visiolinguistic Representations for Vision-and-Language Tasks**.
+
+
+## Repository Setup
+
+1. Create a fresh conda environment and install all dependencies.
+
+```text
+conda create -n vilbert python=3.6
+conda activate vilbert
+git clone https://github.com/jiasenlu/ViLBert
+cd ViLBert
+pip install -r requirements.txt
+```
+
+2. Install PyTorch.
+```
+conda install pytorch torchvision cudatoolkit=10.0 -c pytorch
+```
+
+3. Install apex, following https://github.com/NVIDIA/apex.
+
+4. Install this codebase as a package in this environment.
+```text
+python setup.py develop
+```
+
+## Data Setup
+
+Check `README.md` under `data` for more details.
+
+## Visiolinguistic Pre-training
+
+To train the model:
+
+```
+To be added
+```
+
+For internal use: copy the pre-trained checkpoint from Skynet.
+
+```
+cp -a /srv/share3/jlu347/vilbert/save/* #to_your_directory.
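+# Optional sanity check (a hedged sketch, not part of the original instructions):
+# the finetuning commands below pass --from_pretrained
+# save/bert_base_6_layer_6_connect_freeze_0/pytorch_model_8.bin,
+# so confirm the copied checkpoint landed where that flag expects it.
+ls -lh save/bert_base_6_layer_6_connect_freeze_0/pytorch_model_8.bin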
+```
+
+## Benchmark Vision-Language Tasks
+
+| Task | Sub-Task | Model | LR | Results (split) |
+| :-----------------------: | :---------------: | :---------: | :--: | :-----------------------------------------------------: |
+| **VQA** | - | **ViLBERT** | 4e-5 | **70.55** (test-dev) |
+| - | - | DFAF | - | 70.22 (test-dev) |
+| **VCR** | Q->A | **ViLBERT** | 2e-5 | **73.3** (test) |
+| - | Q->A | R2C | - | 63.8 (test) |
+| **VCR** | QA->R | **ViLBERT** | 2e-5 | **74.6** (test) |
+| - | QA->R | R2C | - | 67.3 (test) |
+| **VCR** | Q->AR | **ViLBERT** | 2e-5 | **54.8** (test) |
+| - | Q->AR | R2C | - | 44.0 (test) |
+| **Ref Expression** | RefCOCO+ | **ViLBERT** | 4e-5 | **72.34** (val) - **78.52** (testA) - **62.61** (testB) |
+| - | RefCOCO+ | MAttNet | - | 65.33 (val) - 71.62 (testA) - 56.02 (testB) |
+| **Ref Expression** | RefCOCO | **ViLBERT** | 4e-5 | - |
+| - | RefCOCO | MAttNet | - | - |
+| **Ref Expression** | Refg | **ViLBERT** | 4e-5 | - |
+| - | Refg | MAttNet | - | - |
+| **Image Caption Ranking** | Image Retrieval | **ViLBERT** | 2e-5 | **58.20** (R1) - **84.90** (R5) - **91.52** (R10) |
+| - | Image Retrieval | SCAN | - | 48.60 (R1) - 77.70 (R5) - 85.20 (R10) |
+| **Image Caption Ranking** | Caption Retrieval | **ViLBERT** | 2e-5 | - |
+| - | Caption Retrieval | SCAN | - | - |
+
+
+## TASKS
+### VQA
+
+To finetune a 6-layer ViLBERT model for VQA on 8 GPUs, run the command below. `--tasks 1` selects the VQA task; check `vlbert_tasks.yml` for more VQA task settings.
+
+```bash
+python -m torch.distributed.launch --nproc_per_node=8 --nnodes=1 --node_rank=0 train_tasks.py --bert_model bert-base-uncased --from_pretrained save/bert_base_6_layer_6_connect_freeze_0/pytorch_model_8.bin --config_file config/bert_base_6layer_6conect.json --learning_rate 4e-5 --num_workers 16 --tasks 1 --save_name pretrained
+```
+
+### VCR
+
+Similarly, to finetune a 6-layer ViLBERT model for the VCR task, run the following command.
+Here we jointly train the `Q->A` and `QA->R` tasks, so the task flag is specified as `--tasks 6-7`.
+
+```
+python -m torch.distributed.launch --nproc_per_node=8 --nnodes=1 --node_rank=0 train_tasks.py --bert_model bert-base-uncased --from_pretrained save/bert_base_6_layer_6_connect_freeze_0/pytorch_model_8.bin --config_file config/bert_base_6layer_6conect.json --learning_rate 2e-5 --num_workers 16 --tasks 6-7 --save_name pretrained
+```
+
+### Refer Expression
+```
+python -m torch.distributed.launch --nproc_per_node=8 --nnodes=1 --node_rank=0 train_tasks.py --bert_model bert-base-uncased --from_pretrained save/bert_base_6_layer_6_connect_freeze_0/pytorch_model_8.bin --config_file config/bert_base_6layer_6conect.json --learning_rate 4e-5 --num_workers 16 --tasks 11 --save_name pretrained
+```
+
+### Image Retrieval
+```
+python -m torch.distributed.launch --nproc_per_node=8 --nnodes=1 --node_rank=0 train_tasks.py --bert_model bert-base-uncased --from_pretrained save/bert_base_6_layer_6_connect_freeze_0/pytorch_model_8.bin --config_file config/bert_base_6layer_6conect.json --learning_rate 4e-5 --num_workers 9 --tasks 11 --save_name pretrained
+```
+
+### Add your own tasks
+```
+
+```
+## Why does ViLBERT look like?
+

+ +

\ No newline at end of file diff --git a/config/bert-base-uncased_weight_name.json b/config/bert-base-uncased_weight_name.json new file mode 100644 index 0000000..e144d69 --- /dev/null +++ b/config/bert-base-uncased_weight_name.json @@ -0,0 +1 @@ +["embeddings.word_embeddings.weight", "embeddings.position_embeddings.weight", "embeddings.token_type_embeddings.weight", "embeddings.LayerNorm.weight", "embeddings.LayerNorm.bias", "encoder.layer.0.attention.self.query.weight", "encoder.layer.0.attention.self.query.bias", "encoder.layer.0.attention.self.key.weight", "encoder.layer.0.attention.self.key.bias", "encoder.layer.0.attention.self.value.weight", "encoder.layer.0.attention.self.value.bias", "encoder.layer.0.attention.output.dense.weight", "encoder.layer.0.attention.output.dense.bias", "encoder.layer.0.attention.output.LayerNorm.weight", "encoder.layer.0.attention.output.LayerNorm.bias", "encoder.layer.0.intermediate.dense.weight", "encoder.layer.0.intermediate.dense.bias", "encoder.layer.0.output.dense.weight", "encoder.layer.0.output.dense.bias", "encoder.layer.0.output.LayerNorm.weight", "encoder.layer.0.output.LayerNorm.bias", "encoder.layer.1.attention.self.query.weight", "encoder.layer.1.attention.self.query.bias", "encoder.layer.1.attention.self.key.weight", "encoder.layer.1.attention.self.key.bias", "encoder.layer.1.attention.self.value.weight", "encoder.layer.1.attention.self.value.bias", "encoder.layer.1.attention.output.dense.weight", "encoder.layer.1.attention.output.dense.bias", "encoder.layer.1.attention.output.LayerNorm.weight", "encoder.layer.1.attention.output.LayerNorm.bias", "encoder.layer.1.intermediate.dense.weight", "encoder.layer.1.intermediate.dense.bias", "encoder.layer.1.output.dense.weight", "encoder.layer.1.output.dense.bias", "encoder.layer.1.output.LayerNorm.weight", "encoder.layer.1.output.LayerNorm.bias", "encoder.layer.2.attention.self.query.weight", "encoder.layer.2.attention.self.query.bias", 
"encoder.layer.2.attention.self.key.weight", "encoder.layer.2.attention.self.key.bias", "encoder.layer.2.attention.self.value.weight", "encoder.layer.2.attention.self.value.bias", "encoder.layer.2.attention.output.dense.weight", "encoder.layer.2.attention.output.dense.bias", "encoder.layer.2.attention.output.LayerNorm.weight", "encoder.layer.2.attention.output.LayerNorm.bias", "encoder.layer.2.intermediate.dense.weight", "encoder.layer.2.intermediate.dense.bias", "encoder.layer.2.output.dense.weight", "encoder.layer.2.output.dense.bias", "encoder.layer.2.output.LayerNorm.weight", "encoder.layer.2.output.LayerNorm.bias", "encoder.layer.3.attention.self.query.weight", "encoder.layer.3.attention.self.query.bias", "encoder.layer.3.attention.self.key.weight", "encoder.layer.3.attention.self.key.bias", "encoder.layer.3.attention.self.value.weight", "encoder.layer.3.attention.self.value.bias", "encoder.layer.3.attention.output.dense.weight", "encoder.layer.3.attention.output.dense.bias", "encoder.layer.3.attention.output.LayerNorm.weight", "encoder.layer.3.attention.output.LayerNorm.bias", "encoder.layer.3.intermediate.dense.weight", "encoder.layer.3.intermediate.dense.bias", "encoder.layer.3.output.dense.weight", "encoder.layer.3.output.dense.bias", "encoder.layer.3.output.LayerNorm.weight", "encoder.layer.3.output.LayerNorm.bias", "encoder.layer.4.attention.self.query.weight", "encoder.layer.4.attention.self.query.bias", "encoder.layer.4.attention.self.key.weight", "encoder.layer.4.attention.self.key.bias", "encoder.layer.4.attention.self.value.weight", "encoder.layer.4.attention.self.value.bias", "encoder.layer.4.attention.output.dense.weight", "encoder.layer.4.attention.output.dense.bias", "encoder.layer.4.attention.output.LayerNorm.weight", "encoder.layer.4.attention.output.LayerNorm.bias", "encoder.layer.4.intermediate.dense.weight", "encoder.layer.4.intermediate.dense.bias", "encoder.layer.4.output.dense.weight", "encoder.layer.4.output.dense.bias", 
"encoder.layer.4.output.LayerNorm.weight", "encoder.layer.4.output.LayerNorm.bias", "encoder.layer.5.attention.self.query.weight", "encoder.layer.5.attention.self.query.bias", "encoder.layer.5.attention.self.key.weight", "encoder.layer.5.attention.self.key.bias", "encoder.layer.5.attention.self.value.weight", "encoder.layer.5.attention.self.value.bias", "encoder.layer.5.attention.output.dense.weight", "encoder.layer.5.attention.output.dense.bias", "encoder.layer.5.attention.output.LayerNorm.weight", "encoder.layer.5.attention.output.LayerNorm.bias", "encoder.layer.5.intermediate.dense.weight", "encoder.layer.5.intermediate.dense.bias", "encoder.layer.5.output.dense.weight", "encoder.layer.5.output.dense.bias", "encoder.layer.5.output.LayerNorm.weight", "encoder.layer.5.output.LayerNorm.bias", "encoder.layer.6.attention.self.query.weight", "encoder.layer.6.attention.self.query.bias", "encoder.layer.6.attention.self.key.weight", "encoder.layer.6.attention.self.key.bias", "encoder.layer.6.attention.self.value.weight", "encoder.layer.6.attention.self.value.bias", "encoder.layer.6.attention.output.dense.weight", "encoder.layer.6.attention.output.dense.bias", "encoder.layer.6.attention.output.LayerNorm.weight", "encoder.layer.6.attention.output.LayerNorm.bias", "encoder.layer.6.intermediate.dense.weight", "encoder.layer.6.intermediate.dense.bias", "encoder.layer.6.output.dense.weight", "encoder.layer.6.output.dense.bias", "encoder.layer.6.output.LayerNorm.weight", "encoder.layer.6.output.LayerNorm.bias", "encoder.layer.7.attention.self.query.weight", "encoder.layer.7.attention.self.query.bias", "encoder.layer.7.attention.self.key.weight", "encoder.layer.7.attention.self.key.bias", "encoder.layer.7.attention.self.value.weight", "encoder.layer.7.attention.self.value.bias", "encoder.layer.7.attention.output.dense.weight", "encoder.layer.7.attention.output.dense.bias", "encoder.layer.7.attention.output.LayerNorm.weight", "encoder.layer.7.attention.output.LayerNorm.bias", 
"encoder.layer.7.intermediate.dense.weight", "encoder.layer.7.intermediate.dense.bias", "encoder.layer.7.output.dense.weight", "encoder.layer.7.output.dense.bias", "encoder.layer.7.output.LayerNorm.weight", "encoder.layer.7.output.LayerNorm.bias", "encoder.layer.8.attention.self.query.weight", "encoder.layer.8.attention.self.query.bias", "encoder.layer.8.attention.self.key.weight", "encoder.layer.8.attention.self.key.bias", "encoder.layer.8.attention.self.value.weight", "encoder.layer.8.attention.self.value.bias", "encoder.layer.8.attention.output.dense.weight", "encoder.layer.8.attention.output.dense.bias", "encoder.layer.8.attention.output.LayerNorm.weight", "encoder.layer.8.attention.output.LayerNorm.bias", "encoder.layer.8.intermediate.dense.weight", "encoder.layer.8.intermediate.dense.bias", "encoder.layer.8.output.dense.weight", "encoder.layer.8.output.dense.bias", "encoder.layer.8.output.LayerNorm.weight", "encoder.layer.8.output.LayerNorm.bias", "encoder.layer.9.attention.self.query.weight", "encoder.layer.9.attention.self.query.bias", "encoder.layer.9.attention.self.key.weight", "encoder.layer.9.attention.self.key.bias", "encoder.layer.9.attention.self.value.weight", "encoder.layer.9.attention.self.value.bias", "encoder.layer.9.attention.output.dense.weight", "encoder.layer.9.attention.output.dense.bias", "encoder.layer.9.attention.output.LayerNorm.weight", "encoder.layer.9.attention.output.LayerNorm.bias", "encoder.layer.9.intermediate.dense.weight", "encoder.layer.9.intermediate.dense.bias", "encoder.layer.9.output.dense.weight", "encoder.layer.9.output.dense.bias", "encoder.layer.9.output.LayerNorm.weight", "encoder.layer.9.output.LayerNorm.bias", "encoder.layer.10.attention.self.query.weight", "encoder.layer.10.attention.self.query.bias", "encoder.layer.10.attention.self.key.weight", "encoder.layer.10.attention.self.key.bias", "encoder.layer.10.attention.self.value.weight", "encoder.layer.10.attention.self.value.bias", 
"encoder.layer.10.attention.output.dense.weight", "encoder.layer.10.attention.output.dense.bias", "encoder.layer.10.attention.output.LayerNorm.weight", "encoder.layer.10.attention.output.LayerNorm.bias", "encoder.layer.10.intermediate.dense.weight", "encoder.layer.10.intermediate.dense.bias", "encoder.layer.10.output.dense.weight", "encoder.layer.10.output.dense.bias", "encoder.layer.10.output.LayerNorm.weight", "encoder.layer.10.output.LayerNorm.bias", "encoder.layer.11.attention.self.query.weight", "encoder.layer.11.attention.self.query.bias", "encoder.layer.11.attention.self.key.weight", "encoder.layer.11.attention.self.key.bias", "encoder.layer.11.attention.self.value.weight", "encoder.layer.11.attention.self.value.bias", "encoder.layer.11.attention.output.dense.weight", "encoder.layer.11.attention.output.dense.bias", "encoder.layer.11.attention.output.LayerNorm.weight", "encoder.layer.11.attention.output.LayerNorm.bias", "encoder.layer.11.intermediate.dense.weight", "encoder.layer.11.intermediate.dense.bias", "encoder.layer.11.output.dense.weight", "encoder.layer.11.output.dense.bias", "encoder.layer.11.output.LayerNorm.weight", "encoder.layer.11.output.LayerNorm.bias"] \ No newline at end of file diff --git a/config/bert-large-uncased_weight_name.json b/config/bert-large-uncased_weight_name.json new file mode 100644 index 0000000..97daec5 --- /dev/null +++ b/config/bert-large-uncased_weight_name.json @@ -0,0 +1 @@ +["embeddings.word_embeddings.weight", "embeddings.position_embeddings.weight", "embeddings.token_type_embeddings.weight", "embeddings.LayerNorm.weight", "embeddings.LayerNorm.bias", "encoder.layer.0.attention.self.query.weight", "encoder.layer.0.attention.self.query.bias", "encoder.layer.0.attention.self.key.weight", "encoder.layer.0.attention.self.key.bias", "encoder.layer.0.attention.self.value.weight", "encoder.layer.0.attention.self.value.bias", "encoder.layer.0.attention.output.dense.weight", "encoder.layer.0.attention.output.dense.bias", 
"encoder.layer.0.attention.output.LayerNorm.weight", "encoder.layer.0.attention.output.LayerNorm.bias", "encoder.layer.0.intermediate.dense.weight", "encoder.layer.0.intermediate.dense.bias", "encoder.layer.0.output.dense.weight", "encoder.layer.0.output.dense.bias", "encoder.layer.0.output.LayerNorm.weight", "encoder.layer.0.output.LayerNorm.bias", "encoder.layer.1.attention.self.query.weight", "encoder.layer.1.attention.self.query.bias", "encoder.layer.1.attention.self.key.weight", "encoder.layer.1.attention.self.key.bias", "encoder.layer.1.attention.self.value.weight", "encoder.layer.1.attention.self.value.bias", "encoder.layer.1.attention.output.dense.weight", "encoder.layer.1.attention.output.dense.bias", "encoder.layer.1.attention.output.LayerNorm.weight", "encoder.layer.1.attention.output.LayerNorm.bias", "encoder.layer.1.intermediate.dense.weight", "encoder.layer.1.intermediate.dense.bias", "encoder.layer.1.output.dense.weight", "encoder.layer.1.output.dense.bias", "encoder.layer.1.output.LayerNorm.weight", "encoder.layer.1.output.LayerNorm.bias", "encoder.layer.2.attention.self.query.weight", "encoder.layer.2.attention.self.query.bias", "encoder.layer.2.attention.self.key.weight", "encoder.layer.2.attention.self.key.bias", "encoder.layer.2.attention.self.value.weight", "encoder.layer.2.attention.self.value.bias", "encoder.layer.2.attention.output.dense.weight", "encoder.layer.2.attention.output.dense.bias", "encoder.layer.2.attention.output.LayerNorm.weight", "encoder.layer.2.attention.output.LayerNorm.bias", "encoder.layer.2.intermediate.dense.weight", "encoder.layer.2.intermediate.dense.bias", "encoder.layer.2.output.dense.weight", "encoder.layer.2.output.dense.bias", "encoder.layer.2.output.LayerNorm.weight", "encoder.layer.2.output.LayerNorm.bias", "encoder.layer.3.attention.self.query.weight", "encoder.layer.3.attention.self.query.bias", "encoder.layer.3.attention.self.key.weight", "encoder.layer.3.attention.self.key.bias", 
"encoder.layer.3.attention.self.value.weight", "encoder.layer.3.attention.self.value.bias", "encoder.layer.3.attention.output.dense.weight", "encoder.layer.3.attention.output.dense.bias", "encoder.layer.3.attention.output.LayerNorm.weight", "encoder.layer.3.attention.output.LayerNorm.bias", "encoder.layer.3.intermediate.dense.weight", "encoder.layer.3.intermediate.dense.bias", "encoder.layer.3.output.dense.weight", "encoder.layer.3.output.dense.bias", "encoder.layer.3.output.LayerNorm.weight", "encoder.layer.3.output.LayerNorm.bias", "encoder.layer.4.attention.self.query.weight", "encoder.layer.4.attention.self.query.bias", "encoder.layer.4.attention.self.key.weight", "encoder.layer.4.attention.self.key.bias", "encoder.layer.4.attention.self.value.weight", "encoder.layer.4.attention.self.value.bias", "encoder.layer.4.attention.output.dense.weight", "encoder.layer.4.attention.output.dense.bias", "encoder.layer.4.attention.output.LayerNorm.weight", "encoder.layer.4.attention.output.LayerNorm.bias", "encoder.layer.4.intermediate.dense.weight", "encoder.layer.4.intermediate.dense.bias", "encoder.layer.4.output.dense.weight", "encoder.layer.4.output.dense.bias", "encoder.layer.4.output.LayerNorm.weight", "encoder.layer.4.output.LayerNorm.bias", "encoder.layer.5.attention.self.query.weight", "encoder.layer.5.attention.self.query.bias", "encoder.layer.5.attention.self.key.weight", "encoder.layer.5.attention.self.key.bias", "encoder.layer.5.attention.self.value.weight", "encoder.layer.5.attention.self.value.bias", "encoder.layer.5.attention.output.dense.weight", "encoder.layer.5.attention.output.dense.bias", "encoder.layer.5.attention.output.LayerNorm.weight", "encoder.layer.5.attention.output.LayerNorm.bias", "encoder.layer.5.intermediate.dense.weight", "encoder.layer.5.intermediate.dense.bias", "encoder.layer.5.output.dense.weight", "encoder.layer.5.output.dense.bias", "encoder.layer.5.output.LayerNorm.weight", "encoder.layer.5.output.LayerNorm.bias", 
"encoder.layer.6.attention.self.query.weight", "encoder.layer.6.attention.self.query.bias", "encoder.layer.6.attention.self.key.weight", "encoder.layer.6.attention.self.key.bias", "encoder.layer.6.attention.self.value.weight", "encoder.layer.6.attention.self.value.bias", "encoder.layer.6.attention.output.dense.weight", "encoder.layer.6.attention.output.dense.bias", "encoder.layer.6.attention.output.LayerNorm.weight", "encoder.layer.6.attention.output.LayerNorm.bias", "encoder.layer.6.intermediate.dense.weight", "encoder.layer.6.intermediate.dense.bias", "encoder.layer.6.output.dense.weight", "encoder.layer.6.output.dense.bias", "encoder.layer.6.output.LayerNorm.weight", "encoder.layer.6.output.LayerNorm.bias", "encoder.layer.7.attention.self.query.weight", "encoder.layer.7.attention.self.query.bias", "encoder.layer.7.attention.self.key.weight", "encoder.layer.7.attention.self.key.bias", "encoder.layer.7.attention.self.value.weight", "encoder.layer.7.attention.self.value.bias", "encoder.layer.7.attention.output.dense.weight", "encoder.layer.7.attention.output.dense.bias", "encoder.layer.7.attention.output.LayerNorm.weight", "encoder.layer.7.attention.output.LayerNorm.bias", "encoder.layer.7.intermediate.dense.weight", "encoder.layer.7.intermediate.dense.bias", "encoder.layer.7.output.dense.weight", "encoder.layer.7.output.dense.bias", "encoder.layer.7.output.LayerNorm.weight", "encoder.layer.7.output.LayerNorm.bias", "encoder.layer.8.attention.self.query.weight", "encoder.layer.8.attention.self.query.bias", "encoder.layer.8.attention.self.key.weight", "encoder.layer.8.attention.self.key.bias", "encoder.layer.8.attention.self.value.weight", "encoder.layer.8.attention.self.value.bias", "encoder.layer.8.attention.output.dense.weight", "encoder.layer.8.attention.output.dense.bias", "encoder.layer.8.attention.output.LayerNorm.weight", "encoder.layer.8.attention.output.LayerNorm.bias", "encoder.layer.8.intermediate.dense.weight", "encoder.layer.8.intermediate.dense.bias", 
"encoder.layer.8.output.dense.weight", "encoder.layer.8.output.dense.bias", "encoder.layer.8.output.LayerNorm.weight", "encoder.layer.8.output.LayerNorm.bias", "encoder.layer.9.attention.self.query.weight", "encoder.layer.9.attention.self.query.bias", "encoder.layer.9.attention.self.key.weight", "encoder.layer.9.attention.self.key.bias", "encoder.layer.9.attention.self.value.weight", "encoder.layer.9.attention.self.value.bias", "encoder.layer.9.attention.output.dense.weight", "encoder.layer.9.attention.output.dense.bias", "encoder.layer.9.attention.output.LayerNorm.weight", "encoder.layer.9.attention.output.LayerNorm.bias", "encoder.layer.9.intermediate.dense.weight", "encoder.layer.9.intermediate.dense.bias", "encoder.layer.9.output.dense.weight", "encoder.layer.9.output.dense.bias", "encoder.layer.9.output.LayerNorm.weight", "encoder.layer.9.output.LayerNorm.bias", "encoder.layer.10.attention.self.query.weight", "encoder.layer.10.attention.self.query.bias", "encoder.layer.10.attention.self.key.weight", "encoder.layer.10.attention.self.key.bias", "encoder.layer.10.attention.self.value.weight", "encoder.layer.10.attention.self.value.bias", "encoder.layer.10.attention.output.dense.weight", "encoder.layer.10.attention.output.dense.bias", "encoder.layer.10.attention.output.LayerNorm.weight", "encoder.layer.10.attention.output.LayerNorm.bias", "encoder.layer.10.intermediate.dense.weight", "encoder.layer.10.intermediate.dense.bias", "encoder.layer.10.output.dense.weight", "encoder.layer.10.output.dense.bias", "encoder.layer.10.output.LayerNorm.weight", "encoder.layer.10.output.LayerNorm.bias", "encoder.layer.11.attention.self.query.weight", "encoder.layer.11.attention.self.query.bias", "encoder.layer.11.attention.self.key.weight", "encoder.layer.11.attention.self.key.bias", "encoder.layer.11.attention.self.value.weight", "encoder.layer.11.attention.self.value.bias", "encoder.layer.11.attention.output.dense.weight", "encoder.layer.11.attention.output.dense.bias", 
"encoder.layer.11.attention.output.LayerNorm.weight", "encoder.layer.11.attention.output.LayerNorm.bias", "encoder.layer.11.intermediate.dense.weight", "encoder.layer.11.intermediate.dense.bias", "encoder.layer.11.output.dense.weight", "encoder.layer.11.output.dense.bias", "encoder.layer.11.output.LayerNorm.weight", "encoder.layer.11.output.LayerNorm.bias", "encoder.layer.12.attention.self.query.weight", "encoder.layer.12.attention.self.query.bias", "encoder.layer.12.attention.self.key.weight", "encoder.layer.12.attention.self.key.bias", "encoder.layer.12.attention.self.value.weight", "encoder.layer.12.attention.self.value.bias", "encoder.layer.12.attention.output.dense.weight", "encoder.layer.12.attention.output.dense.bias", "encoder.layer.12.attention.output.LayerNorm.weight", "encoder.layer.12.attention.output.LayerNorm.bias", "encoder.layer.12.intermediate.dense.weight", "encoder.layer.12.intermediate.dense.bias", "encoder.layer.12.output.dense.weight", "encoder.layer.12.output.dense.bias", "encoder.layer.12.output.LayerNorm.weight", "encoder.layer.12.output.LayerNorm.bias", "encoder.layer.13.attention.self.query.weight", "encoder.layer.13.attention.self.query.bias", "encoder.layer.13.attention.self.key.weight", "encoder.layer.13.attention.self.key.bias", "encoder.layer.13.attention.self.value.weight", "encoder.layer.13.attention.self.value.bias", "encoder.layer.13.attention.output.dense.weight", "encoder.layer.13.attention.output.dense.bias", "encoder.layer.13.attention.output.LayerNorm.weight", "encoder.layer.13.attention.output.LayerNorm.bias", "encoder.layer.13.intermediate.dense.weight", "encoder.layer.13.intermediate.dense.bias", "encoder.layer.13.output.dense.weight", "encoder.layer.13.output.dense.bias", "encoder.layer.13.output.LayerNorm.weight", "encoder.layer.13.output.LayerNorm.bias", "encoder.layer.14.attention.self.query.weight", "encoder.layer.14.attention.self.query.bias", "encoder.layer.14.attention.self.key.weight", 
"encoder.layer.14.attention.self.key.bias", "encoder.layer.14.attention.self.value.weight", "encoder.layer.14.attention.self.value.bias", "encoder.layer.14.attention.output.dense.weight", "encoder.layer.14.attention.output.dense.bias", "encoder.layer.14.attention.output.LayerNorm.weight", "encoder.layer.14.attention.output.LayerNorm.bias", "encoder.layer.14.intermediate.dense.weight", "encoder.layer.14.intermediate.dense.bias", "encoder.layer.14.output.dense.weight", "encoder.layer.14.output.dense.bias", "encoder.layer.14.output.LayerNorm.weight", "encoder.layer.14.output.LayerNorm.bias", "encoder.layer.15.attention.self.query.weight", "encoder.layer.15.attention.self.query.bias", "encoder.layer.15.attention.self.key.weight", "encoder.layer.15.attention.self.key.bias", "encoder.layer.15.attention.self.value.weight", "encoder.layer.15.attention.self.value.bias", "encoder.layer.15.attention.output.dense.weight", "encoder.layer.15.attention.output.dense.bias", "encoder.layer.15.attention.output.LayerNorm.weight", "encoder.layer.15.attention.output.LayerNorm.bias", "encoder.layer.15.intermediate.dense.weight", "encoder.layer.15.intermediate.dense.bias", "encoder.layer.15.output.dense.weight", "encoder.layer.15.output.dense.bias", "encoder.layer.15.output.LayerNorm.weight", "encoder.layer.15.output.LayerNorm.bias", "encoder.layer.16.attention.self.query.weight", "encoder.layer.16.attention.self.query.bias", "encoder.layer.16.attention.self.key.weight", "encoder.layer.16.attention.self.key.bias", "encoder.layer.16.attention.self.value.weight", "encoder.layer.16.attention.self.value.bias", "encoder.layer.16.attention.output.dense.weight", "encoder.layer.16.attention.output.dense.bias", "encoder.layer.16.attention.output.LayerNorm.weight", "encoder.layer.16.attention.output.LayerNorm.bias", "encoder.layer.16.intermediate.dense.weight", "encoder.layer.16.intermediate.dense.bias", "encoder.layer.16.output.dense.weight", "encoder.layer.16.output.dense.bias", 
"encoder.layer.16.output.LayerNorm.weight", "encoder.layer.16.output.LayerNorm.bias", "encoder.layer.17.attention.self.query.weight", "encoder.layer.17.attention.self.query.bias", "encoder.layer.17.attention.self.key.weight", "encoder.layer.17.attention.self.key.bias", "encoder.layer.17.attention.self.value.weight", "encoder.layer.17.attention.self.value.bias", "encoder.layer.17.attention.output.dense.weight", "encoder.layer.17.attention.output.dense.bias", "encoder.layer.17.attention.output.LayerNorm.weight", "encoder.layer.17.attention.output.LayerNorm.bias", "encoder.layer.17.intermediate.dense.weight", "encoder.layer.17.intermediate.dense.bias", "encoder.layer.17.output.dense.weight", "encoder.layer.17.output.dense.bias", "encoder.layer.17.output.LayerNorm.weight", "encoder.layer.17.output.LayerNorm.bias", "encoder.layer.18.attention.self.query.weight", "encoder.layer.18.attention.self.query.bias", "encoder.layer.18.attention.self.key.weight", "encoder.layer.18.attention.self.key.bias", "encoder.layer.18.attention.self.value.weight", "encoder.layer.18.attention.self.value.bias", "encoder.layer.18.attention.output.dense.weight", "encoder.layer.18.attention.output.dense.bias", "encoder.layer.18.attention.output.LayerNorm.weight", "encoder.layer.18.attention.output.LayerNorm.bias", "encoder.layer.18.intermediate.dense.weight", "encoder.layer.18.intermediate.dense.bias", "encoder.layer.18.output.dense.weight", "encoder.layer.18.output.dense.bias", "encoder.layer.18.output.LayerNorm.weight", "encoder.layer.18.output.LayerNorm.bias", "encoder.layer.19.attention.self.query.weight", "encoder.layer.19.attention.self.query.bias", "encoder.layer.19.attention.self.key.weight", "encoder.layer.19.attention.self.key.bias", "encoder.layer.19.attention.self.value.weight", "encoder.layer.19.attention.self.value.bias", "encoder.layer.19.attention.output.dense.weight", "encoder.layer.19.attention.output.dense.bias", "encoder.layer.19.attention.output.LayerNorm.weight", 
"encoder.layer.19.attention.output.LayerNorm.bias", "encoder.layer.19.intermediate.dense.weight", "encoder.layer.19.intermediate.dense.bias", "encoder.layer.19.output.dense.weight", "encoder.layer.19.output.dense.bias", "encoder.layer.19.output.LayerNorm.weight", "encoder.layer.19.output.LayerNorm.bias", "encoder.layer.20.attention.self.query.weight", "encoder.layer.20.attention.self.query.bias", "encoder.layer.20.attention.self.key.weight", "encoder.layer.20.attention.self.key.bias", "encoder.layer.20.attention.self.value.weight", "encoder.layer.20.attention.self.value.bias", "encoder.layer.20.attention.output.dense.weight", "encoder.layer.20.attention.output.dense.bias", "encoder.layer.20.attention.output.LayerNorm.weight", "encoder.layer.20.attention.output.LayerNorm.bias", "encoder.layer.20.intermediate.dense.weight", "encoder.layer.20.intermediate.dense.bias", "encoder.layer.20.output.dense.weight", "encoder.layer.20.output.dense.bias", "encoder.layer.20.output.LayerNorm.weight", "encoder.layer.20.output.LayerNorm.bias", "encoder.layer.21.attention.self.query.weight", "encoder.layer.21.attention.self.query.bias", "encoder.layer.21.attention.self.key.weight", "encoder.layer.21.attention.self.key.bias", "encoder.layer.21.attention.self.value.weight", "encoder.layer.21.attention.self.value.bias", "encoder.layer.21.attention.output.dense.weight", "encoder.layer.21.attention.output.dense.bias", "encoder.layer.21.attention.output.LayerNorm.weight", "encoder.layer.21.attention.output.LayerNorm.bias", "encoder.layer.21.intermediate.dense.weight", "encoder.layer.21.intermediate.dense.bias", "encoder.layer.21.output.dense.weight", "encoder.layer.21.output.dense.bias", "encoder.layer.21.output.LayerNorm.weight", "encoder.layer.21.output.LayerNorm.bias", "encoder.layer.22.attention.self.query.weight", "encoder.layer.22.attention.self.query.bias", "encoder.layer.22.attention.self.key.weight", "encoder.layer.22.attention.self.key.bias", 
"encoder.layer.22.attention.self.value.weight", "encoder.layer.22.attention.self.value.bias", "encoder.layer.22.attention.output.dense.weight", "encoder.layer.22.attention.output.dense.bias", "encoder.layer.22.attention.output.LayerNorm.weight", "encoder.layer.22.attention.output.LayerNorm.bias", "encoder.layer.22.intermediate.dense.weight", "encoder.layer.22.intermediate.dense.bias", "encoder.layer.22.output.dense.weight", "encoder.layer.22.output.dense.bias", "encoder.layer.22.output.LayerNorm.weight", "encoder.layer.22.output.LayerNorm.bias", "encoder.layer.23.attention.self.query.weight", "encoder.layer.23.attention.self.query.bias", "encoder.layer.23.attention.self.key.weight", "encoder.layer.23.attention.self.key.bias", "encoder.layer.23.attention.self.value.weight", "encoder.layer.23.attention.self.value.bias", "encoder.layer.23.attention.output.dense.weight", "encoder.layer.23.attention.output.dense.bias", "encoder.layer.23.attention.output.LayerNorm.weight", "encoder.layer.23.attention.output.LayerNorm.bias", "encoder.layer.23.intermediate.dense.weight", "encoder.layer.23.intermediate.dense.bias", "encoder.layer.23.output.dense.weight", "encoder.layer.23.output.dense.bias", "encoder.layer.23.output.LayerNorm.weight", "encoder.layer.23.output.LayerNorm.bias", "pooler.dense.weight", "pooler.dense.bias", "cls.predictions.bias", "cls.predictions.transform.dense.weight", "cls.predictions.transform.dense.bias", "cls.predictions.transform.LayerNorm.weight", "cls.predictions.transform.LayerNorm.bias", "cls.seq_relationship.weight", "cls.seq_relationship.bias"] \ No newline at end of file diff --git a/config/bert_base_2layer_2conect.json b/config/bert_base_2layer_2conect.json new file mode 100644 index 0000000..cb27489 --- /dev/null +++ b/config/bert_base_2layer_2conect.json @@ -0,0 +1,30 @@ +{ + "attention_probs_dropout_prob": 0.1, + "hidden_act": "gelu", + "hidden_dropout_prob": 0.1, + "hidden_size": 768, + "initializer_range": 0.02, + "intermediate_size": 3072, 
+ "max_position_embeddings": 512, + "num_attention_heads": 12, + "num_hidden_layers": 12, + "type_vocab_size": 2, + "vocab_size": 30522, + "v_feature_size": 2048, + "v_target_size": 1601, + "v_hidden_size": 1024, + "v_num_hidden_layers":2, + "v_num_attention_heads":8, + "v_intermediate_size":1024, + "bi_hidden_size":1024, + "bi_num_attention_heads":8, + "bi_intermediate_size": 1024, + "bi_attention_type":1, + "v_attention_probs_dropout_prob":0.1, + "v_hidden_act":"gelu", + "v_hidden_dropout_prob":0.1, + "v_initializer_range":0.02, + "v_biattention_id":[0, 1], + "t_biattention_id":[10, 11], + "pooling_method": "mul" +} diff --git a/config/bert_base_4layer_4conect.json b/config/bert_base_4layer_4conect.json new file mode 100644 index 0000000..e0dd01f --- /dev/null +++ b/config/bert_base_4layer_4conect.json @@ -0,0 +1,30 @@ +{ + "attention_probs_dropout_prob": 0.1, + "hidden_act": "gelu", + "hidden_dropout_prob": 0.1, + "hidden_size": 768, + "initializer_range": 0.02, + "intermediate_size": 3072, + "max_position_embeddings": 512, + "num_attention_heads": 12, + "num_hidden_layers": 12, + "type_vocab_size": 2, + "vocab_size": 30522, + "v_feature_size": 2048, + "v_target_size": 1601, + "v_hidden_size": 1024, + "v_num_hidden_layers":4, + "v_num_attention_heads":8, + "v_intermediate_size":1024, + "bi_hidden_size":1024, + "bi_num_attention_heads":8, + "bi_intermediate_size": 1024, + "bi_attention_type":1, + "v_attention_probs_dropout_prob":0.1, + "v_hidden_act":"gelu", + "v_hidden_dropout_prob":0.1, + "v_initializer_range":0.02, + "v_biattention_id":[0, 1, 2, 3], + "t_biattention_id":[8, 9, 10, 11], + "pooling_method": "mul" +} diff --git a/config/bert_base_6layer_6conect.json b/config/bert_base_6layer_6conect.json new file mode 100644 index 0000000..e78fc0c --- /dev/null +++ b/config/bert_base_6layer_6conect.json @@ -0,0 +1,30 @@ +{ + "attention_probs_dropout_prob": 0.1, + "hidden_act": "gelu", + "hidden_dropout_prob": 0.1, + "hidden_size": 768, + "initializer_range": 
0.02, + "intermediate_size": 3072, + "max_position_embeddings": 512, + "num_attention_heads": 12, + "num_hidden_layers": 12, + "type_vocab_size": 2, + "vocab_size": 30522, + "v_feature_size": 2048, + "v_target_size": 1601, + "v_hidden_size": 1024, + "v_num_hidden_layers":6, + "v_num_attention_heads":8, + "v_intermediate_size":1024, + "bi_hidden_size":1024, + "bi_num_attention_heads":8, + "bi_intermediate_size": 1024, + "bi_attention_type":1, + "v_attention_probs_dropout_prob":0.1, + "v_hidden_act":"gelu", + "v_hidden_dropout_prob":0.1, + "v_initializer_range":0.02, + "v_biattention_id":[0, 1, 2, 3, 4, 5], + "t_biattention_id":[6, 7, 8, 9, 10, 11], + "pooling_method": "mul" +} diff --git a/config/bert_base_8layer_8conect.json b/config/bert_base_8layer_8conect.json new file mode 100644 index 0000000..b5f3571 --- /dev/null +++ b/config/bert_base_8layer_8conect.json @@ -0,0 +1,30 @@ +{ + "attention_probs_dropout_prob": 0.1, + "hidden_act": "gelu", + "hidden_dropout_prob": 0.1, + "hidden_size": 768, + "initializer_range": 0.02, + "intermediate_size": 3072, + "max_position_embeddings": 512, + "num_attention_heads": 12, + "num_hidden_layers": 12, + "type_vocab_size": 2, + "vocab_size": 30522, + "v_feature_size": 2048, + "v_target_size": 1601, + "v_hidden_size": 1024, + "v_num_hidden_layers":8, + "v_num_attention_heads":8, + "v_intermediate_size":1024, + "bi_hidden_size":1024, + "bi_num_attention_heads":8, + "bi_intermediate_size": 1024, + "bi_attention_type":1, + "v_attention_probs_dropout_prob":0.1, + "v_hidden_act":"gelu", + "v_hidden_dropout_prob":0.1, + "v_initializer_range":0.02, + "v_biattention_id":[0, 1, 2, 3, 4, 5, 6, 7], + "t_biattention_id":[4, 5, 6, 7, 8, 9, 10, 11], + "pooling_method": "mul" +} diff --git a/config/bert_base_baseline.json b/config/bert_base_baseline.json new file mode 100644 index 0000000..fca794a --- /dev/null +++ b/config/bert_base_baseline.json @@ -0,0 +1,13 @@ +{ + "attention_probs_dropout_prob": 0.1, + "hidden_act": "gelu", + 
"hidden_dropout_prob": 0.1, + "hidden_size": 768, + "initializer_range": 0.02, + "intermediate_size": 3072, + "max_position_embeddings": 512, + "num_attention_heads": 12, + "num_hidden_layers": 12, + "type_vocab_size": 2, + "vocab_size": 30522 +} diff --git a/config/bert_large_2layer_2conect.json b/config/bert_large_2layer_2conect.json new file mode 100644 index 0000000..f83a8a9 --- /dev/null +++ b/config/bert_large_2layer_2conect.json @@ -0,0 +1,30 @@ +{ + "attention_probs_dropout_prob": 0.1, + "hidden_act": "gelu", + "hidden_dropout_prob": 0.1, + "hidden_size": 1024, + "initializer_range": 0.02, + "intermediate_size": 4096, + "max_position_embeddings": 512, + "num_attention_heads": 16, + "num_hidden_layers": 24, + "type_vocab_size": 2, + "vocab_size": 30522, + "v_feature_size": 2048, + "v_target_size": 1601, + "v_hidden_size":1024, + "v_num_hidden_layers":2, + "v_num_attention_heads":8, + "v_intermediate_size":1024, + "bi_hidden_size":1024, + "bi_num_attention_heads":8, + "bi_intermediate_size": 1024, + "bi_attention_type":1, + "v_attention_probs_dropout_prob":0.1, + "v_hidden_act":"gelu", + "v_hidden_dropout_prob":0.1, + "v_initializer_range":0.02, + "v_biattention_id":[0, 1], + "t_biattention_id":[22, 23], + "pooling_method": "mul" +} \ No newline at end of file diff --git a/config/bert_large_4layer_4conect.json b/config/bert_large_4layer_4conect.json new file mode 100644 index 0000000..39cbbd5 --- /dev/null +++ b/config/bert_large_4layer_4conect.json @@ -0,0 +1,30 @@ +{ + "attention_probs_dropout_prob": 0.1, + "hidden_act": "gelu", + "hidden_dropout_prob": 0.1, + "hidden_size": 1024, + "initializer_range": 0.02, + "intermediate_size": 4096, + "max_position_embeddings": 512, + "num_attention_heads": 16, + "num_hidden_layers": 24, + "type_vocab_size": 2, + "vocab_size": 30522, + "v_feature_size": 2048, + "v_target_size": 1601, + "v_hidden_size":1024, + "v_num_hidden_layers":4, + "v_num_attention_heads":8, + "v_intermediate_size":1024, + "bi_hidden_size":1024, + 
"bi_num_attention_heads":8, + "bi_intermediate_size": 1024, + "bi_attention_type":1, + "v_attention_probs_dropout_prob":0.1, + "v_hidden_act":"gelu", + "v_hidden_dropout_prob":0.1, + "v_initializer_range":0.02, + "v_biattention_id":[0, 1, 2, 3], + "t_biattention_id":[20, 21, 22, 23], + "pooling_method": "mul" +} \ No newline at end of file diff --git a/config/bert_large_6layer_6conect.json b/config/bert_large_6layer_6conect.json new file mode 100644 index 0000000..db6c0ea --- /dev/null +++ b/config/bert_large_6layer_6conect.json @@ -0,0 +1,30 @@ +{ + "attention_probs_dropout_prob": 0.1, + "hidden_act": "gelu", + "hidden_dropout_prob": 0.1, + "hidden_size": 1024, + "initializer_range": 0.02, + "intermediate_size": 4096, + "max_position_embeddings": 512, + "num_attention_heads": 16, + "num_hidden_layers": 24, + "type_vocab_size": 2, + "vocab_size": 30522, + "v_feature_size": 2048, + "v_target_size": 1601, + "v_hidden_size":1024, + "v_num_hidden_layers":6, + "v_num_attention_heads":8, + "v_intermediate_size":1024, + "bi_hidden_size":1024, + "bi_num_attention_heads":8, + "bi_intermediate_size": 1024, + "bi_attention_type":1, + "v_attention_probs_dropout_prob":0.1, + "v_hidden_act":"gelu", + "v_hidden_dropout_prob":0.1, + "v_initializer_range":0.02, + "v_biattention_id":[0, 1, 2, 3, 4, 5], + "t_biattention_id":[18, 19, 20, 21, 22, 23], + "pooling_method": "mul" +} \ No newline at end of file diff --git a/config/bert_large_baseline.json b/config/bert_large_baseline.json new file mode 100644 index 0000000..a7efa97 --- /dev/null +++ b/config/bert_large_baseline.json @@ -0,0 +1,13 @@ +{ + "attention_probs_dropout_prob": 0.1, + "hidden_act": "gelu", + "hidden_dropout_prob": 0.1, + "hidden_size": 1024, + "initializer_range": 0.02, + "intermediate_size": 4096, + "max_position_embeddings": 512, + "num_attention_heads": 16, + "num_hidden_layers": 24, + "type_vocab_size": 2, + "vocab_size": 30522 +} diff --git a/eval_tasks.py b/eval_tasks.py new file mode 100644 index 
0000000..3a87c01 --- /dev/null +++ b/eval_tasks.py @@ -0,0 +1,235 @@ +import argparse +import json +import logging +import os +import random +from io import open +import numpy as np + +from tensorboardX import SummaryWriter +from tqdm import tqdm +from bisect import bisect +import yaml +from easydict import EasyDict as edict +import sys +import pdb + +import torch +import torch.nn.functional as F +import torch.nn as nn + +from pytorch_pretrained_bert.optimization import BertAdam, WarmupLinearSchedule + +from vilbert.task_utils import LoadDatasetEval, LoadLosses, ForwardModelsTrain, ForwardModelsVal, EvaluatingModel + +import vilbert.utils as utils +import torch.distributed as dist + +logging.basicConfig( + format="%(asctime)s - %(levelname)s - %(name)s - %(message)s", + datefmt="%m/%d/%Y %H:%M:%S", + level=logging.INFO, +) +logger = logging.getLogger(__name__) + +def main(): + parser = argparse.ArgumentParser() + + parser.add_argument( + "--bert_model", + default="bert-base-uncased", + type=str, + help="Bert pre-trained model selected in the list: bert-base-uncased, " + "bert-large-uncased, bert-base-cased, bert-base-multilingual, bert-base-chinese.", + ) + parser.add_argument( + "--from_pretrained", + default="bert-base-uncased", + type=str, + help="Checkpoint to load: a path to a saved ViLBERT checkpoint, " + "or a BERT model name such as bert-base-uncased.", + ) + parser.add_argument( + "--output_dir", + default="results", + type=str, + help="The output directory where the model checkpoints will be written.", + ) + parser.add_argument( + "--config_file", + default="config/bert_config.json", + type=str, + help="The config file which specifies the model details.", + ) + parser.add_argument( + "--no_cuda", action="store_true", help="Whether not to use CUDA when available" + ) + parser.add_argument( + "--do_lower_case", + default=True, + type=bool, + help="Whether to lower case the input text.
True for uncased models, False for cased models.", + ) + parser.add_argument( + "--local_rank", type=int, default=-1, help="local_rank for distributed training on gpus" + ) + parser.add_argument("--seed", type=int, default=42, help="random seed for initialization") + parser.add_argument( + "--fp16", + action="store_true", + help="Whether to use 16-bit float precision instead of 32-bit", + ) + parser.add_argument( + "--loss_scale", + type=float, + default=0, + help="Loss scaling to improve fp16 numeric stability. Only used when fp16 set to True.\n" + "0 (default value): dynamic loss scaling.\n" + "Positive power of 2: static loss scaling value.\n", + ) + parser.add_argument( + "--num_workers", type=int, default=16, help="Number of workers in the dataloader." + ) + parser.add_argument( + "--save_name", + default='', + type=str, + help="Save name for this run.", + ) + parser.add_argument( + "--use_chunk", default=0, type=float, help="Fraction of the per-GPU batch share assigned to the first GPU when chunking (0 disables chunking)." + ) + parser.add_argument( + "--batch_size", default=1024, type=int, help="Evaluation batch size." + ) + parser.add_argument( + "--tasks", default='', type=str, help="Task ids to evaluate, separated by '-', e.g. 1-2-3." + ) + parser.add_argument( + "--in_memory", default=False, type=bool, help="Whether to load the dataset into memory." + ) + parser.add_argument( + "--baseline", action="store_true", help="Whether to use the single-stream baseline." + ) + parser.add_argument( + "--split", default="", type=str, help="Which split to use."
+ ) + + args = parser.parse_args() + with open('vlbert_tasks.yml', 'r') as f: + task_cfg = edict(yaml.load(f)) + + random.seed(args.seed) + np.random.seed(args.seed) + torch.manual_seed(args.seed) + + if args.baseline: + from pytorch_pretrained_bert.modeling import BertConfig + from vilbert.basebert import BaseBertForVLTasks + else: + from vilbert.vilbert import BertConfig + from vilbert.vilbert import VILBertForVLTasks + + task_names = [] + for i, task_id in enumerate(args.tasks.split('-')): + task = 'TASK' + task_id + name = task_cfg[task]['name'] + task_names.append(name) + + # timeStamp = '-'.join(task_names) + '_' + args.config_file.split('/')[1].split('.')[0] + timeStamp = args.from_pretrained.split('/')[1] + '-' + args.save_name + savePath = os.path.join(args.output_dir, timeStamp) + + config = BertConfig.from_json_file(args.config_file) + bert_weight_name = json.load(open("config/" + args.bert_model + "_weight_name.json", "r")) + + if args.local_rank == -1 or args.no_cuda: + device = torch.device("cuda" if torch.cuda.is_available() and not args.no_cuda else "cpu") + n_gpu = torch.cuda.device_count() + else: + torch.cuda.set_device(args.local_rank) + device = torch.device("cuda", args.local_rank) + n_gpu = 1 + # Initializes the distributed backend which will take care of synchronizing nodes/GPUs + torch.distributed.init_process_group(backend="nccl") + + logger.info( + "device: {} n_gpu: {}, distributed training: {}, 16-bits training: {}".format( + device, n_gpu, bool(args.local_rank != -1), args.fp16 + ) + ) + + default_gpu = False + if dist.is_available() and args.local_rank != -1: + rank = dist.get_rank() + if rank == 0: + default_gpu = True + else: + default_gpu = True + + if default_gpu and not os.path.exists(savePath): + os.makedirs(savePath) + + task_batch_size, task_num_iters, task_ids, task_datasets_val, task_dataloader_val \ + = LoadDatasetEval(args, task_cfg, args.tasks.split('-')) + + tbLogger = utils.tbLogger(timeStamp, savePath, task_names,
task_ids, task_num_iters, 1, save_logger=False, txt_name='eval.txt') + + num_labels = max([dataset.num_labels for dataset in task_datasets_val.values()]) + + if args.baseline: + model = BaseBertForVLTasks.from_pretrained( + args.from_pretrained, config, num_labels=num_labels, default_gpu=default_gpu + ) + else: + model = VILBertForVLTasks.from_pretrained( + args.from_pretrained, config, num_labels=num_labels, default_gpu=default_gpu + ) + + task_losses = LoadLosses(args, task_cfg, args.tasks.split('-')) + model.to(device) + if args.local_rank != -1: + try: + from apex.parallel import DistributedDataParallel as DDP + except ImportError: + raise ImportError( + "Please install apex from https://www.github.com/nvidia/apex to use distributed and fp16 training." + ) + model = DDP(model, delay_allreduce=True) + + elif n_gpu > 1: + model = nn.DataParallel(model) + + no_decay = ["bias", "LayerNorm.bias", "LayerNorm.weight"] + + print("***** Running evaluation *****") + print(" Num Iters: ", task_num_iters) + print(" Batch size: ", task_batch_size) + + model.eval() + # When evaluating, we run each task sequentially. + for task_id in task_ids: + results = [] + others = [] + for i, batch in enumerate(task_dataloader_val[task_id]): + loss, score, batch_size, results, others = EvaluatingModel(args, task_cfg, device, \ + task_id, batch, model, task_dataloader_val, task_losses, results, others) + + tbLogger.step_val(0, float(loss), float(score), task_id, batch_size, 'val') + + sys.stdout.write('%d/%d\r' % (i, len(task_dataloader_val[task_id]))) + sys.stdout.flush() + # save the result or evaluate the result.
+ ave_score = tbLogger.showLossVal() + + if args.split: + json_path = os.path.join(savePath, args.split) + else: + json_path = os.path.join(savePath, task_cfg[task_id]['val_split']) + + json.dump(results, open(json_path+ '_result.json', 'w')) + json.dump(others, open(json_path+ '_others.json', 'w')) + +if __name__ == "__main__": + + main() diff --git a/parallel/data_parallel.py b/parallel/data_parallel.py new file mode 100644 index 0000000..8553457 --- /dev/null +++ b/parallel/data_parallel.py @@ -0,0 +1,226 @@ +import operator +import torch +import warnings +from itertools import chain +from torch.nn import Module +from torch.nn.parallel.replicate import replicate +from torch.nn.parallel.parallel_apply import parallel_apply +from torch.cuda._utils import _get_device_index + +from parallel.scatter_gather import scatter_kwargs, gather + +def _check_balance(device_ids): + imbalance_warn = """ + There is an imbalance between your GPUs. You may want to exclude GPU {} which + has less than 75% of the memory or cores of GPU {}. You can do so by setting + the device_ids argument to DataParallel, or by setting the CUDA_VISIBLE_DEVICES + environment variable.""" + device_ids = list(map(lambda x: _get_device_index(x, True), device_ids)) + dev_props = [torch.cuda.get_device_properties(i) for i in device_ids] + + def warn_imbalance(get_prop): + values = [get_prop(props) for props in dev_props] + min_pos, min_val = min(enumerate(values), key=operator.itemgetter(1)) + max_pos, max_val = max(enumerate(values), key=operator.itemgetter(1)) + if min_val / max_val < 0.75: + warnings.warn(imbalance_warn.format(device_ids[min_pos], device_ids[max_pos])) + return True + return False + + if warn_imbalance(lambda props: props.total_memory): + return + if warn_imbalance(lambda props: props.multi_processor_count): + return + + +class DataParallel(Module): + r"""Implements data parallelism at the module level. 
+ + This container parallelizes the application of the given :attr:`module` by + splitting the input across the specified devices by chunking in the batch + dimension (other objects will be copied once per device). In the forward + pass, the module is replicated on each device, and each replica handles a + portion of the input. During the backwards pass, gradients from each replica + are summed into the original module. + + The batch size should be larger than the number of GPUs used. + + See also: :ref:`cuda-nn-dataparallel-instead` + + Arbitrary positional and keyword inputs are allowed to be passed into + DataParallel but some types are specially handled. tensors will be + **scattered** on dim specified (default 0). tuple, list and dict types will + be shallow copied. The other types will be shared among different threads + and can be corrupted if written to in the model's forward pass. + + The parallelized :attr:`module` must have its parameters and buffers on + ``device_ids[0]`` before running this :class:`~torch.nn.DataParallel` + module. + + .. warning:: + In each forward, :attr:`module` is **replicated** on each device, so any + updates to the running module in ``forward`` will be lost. For example, + if :attr:`module` has a counter attribute that is incremented in each + ``forward``, it will always stay at the initial value because the update + is done on the replicas which are destroyed after ``forward``. However, + :class:`~torch.nn.DataParallel` guarantees that the replica on + ``device[0]`` will have its parameters and buffers sharing storage with + the base parallelized :attr:`module`. So **in-place** updates to the + parameters or buffers on ``device[0]`` will be recorded. E.g., + :class:`~torch.nn.BatchNorm2d` and :func:`~torch.nn.utils.spectral_norm` + rely on this behavior to update the buffers. + + .. 
warning:: + Forward and backward hooks defined on :attr:`module` and its submodules + will be invoked ``len(device_ids)`` times, each with inputs located on + a particular device. Particularly, the hooks are only guaranteed to be + executed in correct order with respect to operations on corresponding + devices. For example, it is not guaranteed that hooks set via + :meth:`~torch.nn.Module.register_forward_pre_hook` be executed before + `all` ``len(device_ids)`` :meth:`~torch.nn.Module.forward` calls, but + that each such hook be executed before the corresponding + :meth:`~torch.nn.Module.forward` call of that device. + + .. warning:: + When :attr:`module` returns a scalar (i.e., 0-dimensional tensor) in + :func:`forward`, this wrapper will return a vector of length equal to + number of devices used in data parallelism, containing the result from + each device. + + .. note:: + There is a subtlety in using the + ``pack sequence -> recurrent network -> unpack sequence`` pattern in a + :class:`~torch.nn.Module` wrapped in :class:`~torch.nn.DataParallel`. + See :ref:`pack-rnn-unpack-with-data-parallelism` section in FAQ for + details. 
+ + + Args: + module (Module): module to be parallelized + device_ids (list of int or torch.device): CUDA devices (default: all devices) + output_device (int or torch.device): device location of output (default: device_ids[0]) + + Attributes: + module (Module): the module to be parallelized + + Example:: + + >>> net = torch.nn.DataParallel(model, device_ids=[0, 1, 2]) + >>> output = net(input_var) # input_var can be on any device, including CPU + """ + + # TODO: update notes/cuda.rst when this class handles 8+ GPUs well + + def __init__(self, module, device_ids=None, output_device=None, dim=0, use_chuncks=0): + super(DataParallel, self).__init__() + + if not torch.cuda.is_available(): + self.module = module + self.device_ids = [] + return + + if device_ids is None: + device_ids = list(range(torch.cuda.device_count())) + if output_device is None: + output_device = device_ids[0] + + self.dim = dim + self.module = module + self.device_ids = list(map(lambda x: _get_device_index(x, True), device_ids)) + self.output_device = _get_device_index(output_device, True) + self.src_device_obj = torch.device("cuda:{}".format(self.device_ids[0])) + self.use_chuncks = use_chuncks + + _check_balance(self.device_ids) + + if len(self.device_ids) == 1: + self.module.cuda(device_ids[0]) + + def forward(self, *inputs, **kwargs): + if not self.device_ids: + return self.module(*inputs, **kwargs) + + for t in chain(self.module.parameters(), self.module.buffers()): + if t.device != self.src_device_obj: + raise RuntimeError("module must have its parameters and buffers " + "on device {} (device_ids[0]) but found one of " + "them on device: {}".format(self.src_device_obj, t.device)) + + if self.use_chuncks > 0: + # we want to assign 0.6 of the batch size to first gpu + batch_size = inputs[0].size(0) + chunk_sizes = [0] * len(self.device_ids) + # first gpu is using only 0.6 + chunk_sizes[0] = int(((batch_size / len(self.device_ids)) * self.use_chuncks)) + rest_batch = batch_size - chunk_sizes[0] 
+ r_bs = int(rest_batch / (len(self.device_ids)-1))+1 + + for i in range(1, len(self.device_ids)): + if i == len(self.device_ids)-1: + chunk_sizes[i] = batch_size - sum(chunk_sizes) + else: + chunk_sizes[i] = r_bs + else: + chunk_sizes = None + + inputs, kwargs = self.scatter(inputs, kwargs, self.device_ids, chunk_sizes) + if len(self.device_ids) == 1: + return self.module(*inputs[0], **kwargs[0]) + replicas = self.replicate(self.module, self.device_ids[:len(inputs)]) + outputs = self.parallel_apply(replicas, inputs, kwargs) + return self.gather(outputs, self.output_device) + + def replicate(self, module, device_ids): + return replicate(module, device_ids) + + def scatter(self, inputs, kwargs, device_ids, chunk_sizes): + return scatter_kwargs(inputs, kwargs, device_ids, dim=self.dim, chunk_sizes=chunk_sizes) + + def parallel_apply(self, replicas, inputs, kwargs): + return parallel_apply(replicas, inputs, kwargs, self.device_ids[:len(replicas)]) + + def gather(self, outputs, output_device): + return gather(outputs, output_device, dim=self.dim) + + +def data_parallel(module, inputs, device_ids=None, output_device=None, dim=0, module_kwargs=None): + r"""Evaluates module(input) in parallel across the GPUs given in device_ids. + + This is the functional version of the DataParallel module. + + Args: + module (Module): the module to evaluate in parallel + inputs (Tensor): inputs to the module + device_ids (list of int or torch.device): GPU ids on which to replicate module + output_device (list of int or torch.device): GPU location of the output Use -1 to indicate the CPU. 
+ (default: device_ids[0]) + Returns: + a Tensor containing the result of module(input) located on + output_device + """ + if not isinstance(inputs, tuple): + inputs = (inputs,) + + if device_ids is None: + device_ids = list(range(torch.cuda.device_count())) + + if output_device is None: + output_device = device_ids[0] + + device_ids = list(map(lambda x: _get_device_index(x, True), device_ids)) + output_device = _get_device_index(output_device, True) + src_device_obj = torch.device("cuda:{}".format(device_ids[0])) + + for t in chain(module.parameters(), module.buffers()): + if t.device != src_device_obj: + raise RuntimeError("module must have its parameters and buffers " + "on device {} (device_ids[0]) but found one of " + "them on device: {}".format(src_device_obj, t.device)) + + inputs, module_kwargs = scatter_kwargs(inputs, module_kwargs, device_ids, dim) + if len(device_ids) == 1: + return module(*inputs[0], **module_kwargs[0]) + used_device_ids = device_ids[:len(inputs)] + replicas = replicate(module, used_device_ids) + outputs = parallel_apply(replicas, inputs, module_kwargs, used_device_ids) + return gather(outputs, output_device, dim) \ No newline at end of file diff --git a/parallel/parallel.py b/parallel/parallel.py new file mode 100755 index 0000000..91dd53f --- /dev/null +++ b/parallel/parallel.py @@ -0,0 +1,281 @@ +##+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ +## Created by: Hang Zhang, Rutgers University, Email: zhang.hang@rutgers.edu +## Modified by Thomas Wolf, HuggingFace Inc., Email: thomas@huggingface.co +## Copyright (c) 2017-2018 +## +## This source code is licensed under the MIT-style license found in the +## LICENSE file in the root directory of this source tree +##+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ + +"""Encoding Data Parallel""" +import threading +import functools +import torch +from torch.autograd import Variable, Function +import torch.cuda.comm as comm +from 
torch.nn.parallel.data_parallel import DataParallel + +from torch.nn.parallel.parallel_apply import get_a_var +from torch.nn.parallel.scatter_gather import gather +from torch.nn.parallel._functions import ReduceAddCoalesced, Broadcast + +torch_ver = torch.__version__[:3] + +__all__ = ['allreduce', 'DataParallelModel', 'DataParallelCriterion', + 'patch_replication_callback'] + +def allreduce(*inputs): + """Cross-GPU all-reduce autograd operation for calculating mean and + variance in SyncBN. + """ + return AllReduce.apply(*inputs) + +class AllReduce(Function): + @staticmethod + def forward(ctx, num_inputs, *inputs): + ctx.num_inputs = num_inputs + ctx.target_gpus = [inputs[i].get_device() for i in range(0, len(inputs), num_inputs)] + inputs = [inputs[i:i + num_inputs] + for i in range(0, len(inputs), num_inputs)] + # sort before reduce sum + inputs = sorted(inputs, key=lambda i: i[0].get_device()) + results = comm.reduce_add_coalesced(inputs, ctx.target_gpus[0]) + outputs = comm.broadcast_coalesced(results, ctx.target_gpus) + return tuple([t for tensors in outputs for t in tensors]) + + @staticmethod + def backward(ctx, *inputs): + inputs = [i.data for i in inputs] + inputs = [inputs[i:i + ctx.num_inputs] + for i in range(0, len(inputs), ctx.num_inputs)] + results = comm.reduce_add_coalesced(inputs, ctx.target_gpus[0]) + outputs = comm.broadcast_coalesced(results, ctx.target_gpus) + return (None,) + tuple([Variable(t) for tensors in outputs for t in tensors]) + + +class Reduce(Function): + @staticmethod + def forward(ctx, *inputs): + ctx.target_gpus = [inputs[i].get_device() for i in range(len(inputs))] + inputs = sorted(inputs, key=lambda i: i.get_device()) + return comm.reduce_add(inputs) + + @staticmethod + def backward(ctx, gradOutput): + return Broadcast.apply(ctx.target_gpus, gradOutput) + +# class DistributedDataParallelModel(DistributedDataParallel): +# """Implements data parallelism at the module level for the DistributedDataParallel module.
+# This container parallelizes the application of the given module by +# splitting the input across the specified devices by chunking in the +# batch dimension. +# In the forward pass, the module is replicated on each device, +# and each replica handles a portion of the input. During the backwards pass, +# gradients from each replica are summed into the original module. +# Note that the outputs are not gathered, please use compatible +# :class:`encoding.parallel.DataParallelCriterion`. +# The batch size should be larger than the number of GPUs used. It should +# also be an integer multiple of the number of GPUs so that each chunk is +# the same size (so that each GPU processes the same number of samples). +# Args: +# module: module to be parallelized +# device_ids: CUDA devices (default: all devices) +# Reference: +# Hang Zhang, Kristin Dana, Jianping Shi, Zhongyue Zhang, Xiaogang Wang, Ambrish Tyagi, +# Amit Agrawal. “Context Encoding for Semantic Segmentation. +# *The IEEE Conference on Computer Vision and Pattern Recognition (CVPR) 2018* +# Example:: +# >>> net = encoding.nn.DistributedDataParallelModel(model, device_ids=[0, 1, 2]) +# >>> y = net(x) +# """ +# def gather(self, outputs, output_device): +# return outputs + +class DataParallelModel(DataParallel): + """Implements data parallelism at the module level. + + This container parallelizes the application of the given module by + splitting the input across the specified devices by chunking in the + batch dimension. + In the forward pass, the module is replicated on each device, + and each replica handles a portion of the input. During the backwards pass, + gradients from each replica are summed into the original module. + Note that the outputs are not gathered, please use compatible + :class:`encoding.parallel.DataParallelCriterion`. + + The batch size should be larger than the number of GPUs used. 
It should + also be an integer multiple of the number of GPUs so that each chunk is + the same size (so that each GPU processes the same number of samples). + + Args: + module: module to be parallelized + device_ids: CUDA devices (default: all devices) + + Reference: + Hang Zhang, Kristin Dana, Jianping Shi, Zhongyue Zhang, Xiaogang Wang, Ambrish Tyagi, + Amit Agrawal. “Context Encoding for Semantic Segmentation.” + *The IEEE Conference on Computer Vision and Pattern Recognition (CVPR) 2018* + + Example:: + + >>> net = encoding.nn.DataParallelModel(model, device_ids=[0, 1, 2]) + >>> y = net(x) + """ + def gather(self, outputs, output_device): + return outputs + + def replicate(self, module, device_ids): + modules = super(DataParallelModel, self).replicate(module, device_ids) + execute_replication_callbacks(modules) + return modules + + +class DataParallelCriterion(DataParallel): + """ + Calculate loss on multiple GPUs, which balances the memory usage. + The targets are split across the specified devices by chunking in + the batch dimension. Please use it together with :class:`encoding.parallel.DataParallelModel`. + + Reference: + Hang Zhang, Kristin Dana, Jianping Shi, Zhongyue Zhang, Xiaogang Wang, Ambrish Tyagi, + Amit Agrawal. “Context Encoding for Semantic Segmentation.”
+ *The IEEE Conference on Computer Vision and Pattern Recognition (CVPR) 2018* + + Example:: + + >>> net = encoding.nn.DataParallelModel(model, device_ids=[0, 1, 2]) + >>> criterion = encoding.nn.DataParallelCriterion(criterion, device_ids=[0, 1, 2]) + >>> y = net(x) + >>> loss = criterion(y, target) + """ + def forward(self, inputs, *targets, **kwargs): + # inputs are already scattered; + # scatter the targets instead + if not self.device_ids: + return self.module(inputs, *targets, **kwargs) + targets, kwargs = self.scatter(targets, kwargs, self.device_ids) + if len(self.device_ids) == 1: + return self.module(inputs, *targets[0], **kwargs[0]) + replicas = self.replicate(self.module, self.device_ids[:len(inputs)]) + outputs = _criterion_parallel_apply(replicas, inputs, targets, kwargs) + #return Reduce.apply(*outputs) / len(outputs) + #return self.gather(outputs, self.output_device).mean() + return self.gather(outputs, self.output_device) + + +def _criterion_parallel_apply(modules, inputs, targets, kwargs_tup=None, devices=None): + assert len(modules) == len(inputs) + assert len(targets) == len(inputs) + if kwargs_tup: + assert len(modules) == len(kwargs_tup) + else: + kwargs_tup = ({},) * len(modules) + if devices is not None: + assert len(modules) == len(devices) + else: + devices = [None] * len(modules) + + lock = threading.Lock() + results = {} + if torch_ver != "0.3": + grad_enabled = torch.is_grad_enabled() + + def _worker(i, module, input, target, kwargs, device=None): + if torch_ver != "0.3": + torch.set_grad_enabled(grad_enabled) + if device is None: + device = get_a_var(input).get_device() + try: + with torch.cuda.device(device): + # this also avoids accidental slicing of `input` if it is a Tensor + if not isinstance(input, (list, tuple)): + input = (input,) + if not isinstance(target, (list, tuple)): + target = (target,) + output = module(*(input + target), **kwargs) + with lock: + results[i] = output + except Exception as e: + with lock: +
results[i] = e
+
+    if len(modules) > 1:
+        threads = [threading.Thread(target=_worker,
+                                    args=(i, module, input, target,
+                                          kwargs, device),)
+                   for i, (module, input, target, kwargs, device) in
+                   enumerate(zip(modules, inputs, targets, kwargs_tup, devices))]
+
+        for thread in threads:
+            thread.start()
+        for thread in threads:
+            thread.join()
+    else:
+        # single-device path: targets[0] must be passed so the arguments
+        # line up with _worker(i, module, input, target, kwargs, device)
+        _worker(0, modules[0], inputs[0], targets[0], kwargs_tup[0], devices[0])
+
+    outputs = []
+    for i in range(len(inputs)):
+        output = results[i]
+        if isinstance(output, Exception):
+            raise output
+        outputs.append(output)
+    return outputs
+
+
+###########################################################################
+# Adapted from Synchronized-BatchNorm-PyTorch.
+# https://github.com/vacancy/Synchronized-BatchNorm-PyTorch
+#
+class CallbackContext(object):
+    pass
+
+
+def execute_replication_callbacks(modules):
+    """
+    Execute a replication callback `__data_parallel_replicate__` on each module created
+    by the original replication.
+
+    The callback will be invoked with arguments `__data_parallel_replicate__(ctx, copy_id)`.
+
+    Note that, as all replicated modules are isomorphic, we assign each sub-module a context
+    (shared among the copies of that sub-module on different devices).
+    Through this context, different copies can share information.
+
+    We guarantee that the callback on the master copy (the first copy) is called
+    before the callbacks on any slave copies.
+    """
+    master_copy = modules[0]
+    nr_modules = len(list(master_copy.modules()))
+    ctxs = [CallbackContext() for _ in range(nr_modules)]
+
+    for i, module in enumerate(modules):
+        for j, m in enumerate(module.modules()):
+            if hasattr(m, '__data_parallel_replicate__'):
+                m.__data_parallel_replicate__(ctxs[j], i)
+
+
+def patch_replication_callback(data_parallel):
+    """
+    Monkey-patch an existing `DataParallel` object to add the replication callback.
+    Useful when you have a customized `DataParallel` implementation.
+ + Examples: + > sync_bn = SynchronizedBatchNorm1d(10, eps=1e-5, affine=False) + > sync_bn = DataParallel(sync_bn, device_ids=[0, 1]) + > patch_replication_callback(sync_bn) + # this is equivalent to + > sync_bn = SynchronizedBatchNorm1d(10, eps=1e-5, affine=False) + > sync_bn = DataParallelWithCallback(sync_bn, device_ids=[0, 1]) + """ + + assert isinstance(data_parallel, DataParallel) + + old_replicate = data_parallel.replicate + + @functools.wraps(old_replicate) + def new_replicate(module, device_ids): + modules = old_replicate(module, device_ids) + execute_replication_callbacks(modules) + return modules + + data_parallel.replicate = new_replicate diff --git a/parallel/scatter_gather.py b/parallel/scatter_gather.py new file mode 100644 index 0000000..8900f8d --- /dev/null +++ b/parallel/scatter_gather.py @@ -0,0 +1,70 @@ +import torch +from torch.nn.parallel._functions import Scatter, Gather + + +def scatter(inputs, target_gpus, dim=0, chunk_sizes=None): + r""" + Slices tensors into approximately equal chunks and + distributes them across given GPUs. Duplicates + references to objects that are not tensors. + """ + def scatter_map(obj): + if isinstance(obj, torch.Tensor): + return Scatter.apply(target_gpus, chunk_sizes, dim, obj) + if isinstance(obj, tuple) and len(obj) > 0: + return list(zip(*map(scatter_map, obj))) + if isinstance(obj, list) and len(obj) > 0: + return list(map(list, zip(*map(scatter_map, obj)))) + if isinstance(obj, dict) and len(obj) > 0: + return list(map(type(obj), zip(*map(scatter_map, obj.items())))) + return [obj for targets in target_gpus] + + # After scatter_map is called, a scatter_map cell will exist. This cell + # has a reference to the actual function scatter_map, which has references + # to a closure that has a reference to the scatter_map cell (because the + # fn is recursive). 
To avoid this reference cycle, we set the function to + # None, clearing the cell + try: + return scatter_map(inputs) + finally: + scatter_map = None + + +def scatter_kwargs(inputs, kwargs, target_gpus, dim=0, chunk_sizes=None): + r"""Scatter with support for kwargs dictionary""" + inputs = scatter(inputs, target_gpus, dim, chunk_sizes) if inputs else [] + kwargs = scatter(kwargs, target_gpus, dim, chunk_sizes) if kwargs else [] + if len(inputs) < len(kwargs): + inputs.extend([() for _ in range(len(kwargs) - len(inputs))]) + elif len(kwargs) < len(inputs): + kwargs.extend([{} for _ in range(len(inputs) - len(kwargs))]) + inputs = tuple(inputs) + kwargs = tuple(kwargs) + + return inputs, kwargs + + +def gather(outputs, target_device, dim=0): + r""" + Gathers tensors from different GPUs on a specified device + (-1 means the CPU). + """ + def gather_map(outputs): + out = outputs[0] + if isinstance(out, torch.Tensor): + return Gather.apply(target_device, dim, *outputs) + if out is None: + return None + if isinstance(out, dict): + if not all((len(out) == len(d) for d in outputs)): + raise ValueError('All dicts must have the same number of keys') + return type(out)(((k, gather_map([d[k] for d in outputs])) + for k in out)) + return type(out)(map(gather_map, zip(*outputs))) + + # Recursive function calls like this create reference cycles. + # Setting the function to None clears the refcycle. 
+    try:
+        return gather_map(outputs)
+    finally:
+        gather_map = None
diff --git a/requirements.txt b/requirements.txt
new file mode 100644
index 0000000..80caf37
--- /dev/null
+++ b/requirements.txt
@@ -0,0 +1,8 @@
+h5py==2.9.0
+pytorch-transformers
+lmdb==0.94
+tensorboardX==1.2
+tensorflow==1.13.1
+tensorpack==0.9.4
+torch==1.1.0
+tqdm==4.31.1
\ No newline at end of file
diff --git a/tools/__init__.py b/tools/__init__.py
new file mode 100644
index 0000000..3f7d85b
--- /dev/null
+++ b/tools/__init__.py
@@ -0,0 +1 @@
+__author__ = 'tylin'
diff --git a/tools/refer/LICENSE b/tools/refer/LICENSE
new file mode 100755
index 0000000..261eeb9
--- /dev/null
+++ b/tools/refer/LICENSE
@@ -0,0 +1,201 @@
+                                 Apache License
+                           Version 2.0, January 2004
+                        http://www.apache.org/licenses/
+
+   TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
+
+   1. Definitions.
+
+      "License" shall mean the terms and conditions for use, reproduction,
+      and distribution as defined by Sections 1 through 9 of this document.
+
+      "Licensor" shall mean the copyright owner or entity authorized by
+      the copyright owner that is granting the License.
+
+      "Legal Entity" shall mean the union of the acting entity and all
+      other entities that control, are controlled by, or are under common
+      control with that entity. For the purposes of this definition,
+      "control" means (i) the power, direct or indirect, to cause the
+      direction or management of such entity, whether by contract or
+      otherwise, or (ii) ownership of fifty percent (50%) or more of the
+      outstanding shares, or (iii) beneficial ownership of such entity.
+
+      "You" (or "Your") shall mean an individual or Legal Entity
+      exercising permissions granted by this License.
+
+      "Source" form shall mean the preferred form for making modifications,
+      including but not limited to software source code, documentation
+      source, and configuration files.
+ + "Object" form shall mean any form resulting from mechanical + transformation or translation of a Source form, including but + not limited to compiled object code, generated documentation, + and conversions to other media types. + + "Work" shall mean the work of authorship, whether in Source or + Object form, made available under the License, as indicated by a + copyright notice that is included in or attached to the work + (an example is provided in the Appendix below). + + "Derivative Works" shall mean any work, whether in Source or Object + form, that is based on (or derived from) the Work and for which the + editorial revisions, annotations, elaborations, or other modifications + represent, as a whole, an original work of authorship. For the purposes + of this License, Derivative Works shall not include works that remain + separable from, or merely link (or bind by name) to the interfaces of, + the Work and Derivative Works thereof. + + "Contribution" shall mean any work of authorship, including + the original version of the Work and any modifications or additions + to that Work or Derivative Works thereof, that is intentionally + submitted to Licensor for inclusion in the Work by the copyright owner + or by an individual or Legal Entity authorized to submit on behalf of + the copyright owner. For the purposes of this definition, "submitted" + means any form of electronic, verbal, or written communication sent + to the Licensor or its representatives, including but not limited to + communication on electronic mailing lists, source code control systems, + and issue tracking systems that are managed by, or on behalf of, the + Licensor for the purpose of discussing and improving the Work, but + excluding communication that is conspicuously marked or otherwise + designated in writing by the copyright owner as "Not a Contribution." 
+ + "Contributor" shall mean Licensor and any individual or Legal Entity + on behalf of whom a Contribution has been received by Licensor and + subsequently incorporated within the Work. + + 2. Grant of Copyright License. Subject to the terms and conditions of + this License, each Contributor hereby grants to You a perpetual, + worldwide, non-exclusive, no-charge, royalty-free, irrevocable + copyright license to reproduce, prepare Derivative Works of, + publicly display, publicly perform, sublicense, and distribute the + Work and such Derivative Works in Source or Object form. + + 3. Grant of Patent License. Subject to the terms and conditions of + this License, each Contributor hereby grants to You a perpetual, + worldwide, non-exclusive, no-charge, royalty-free, irrevocable + (except as stated in this section) patent license to make, have made, + use, offer to sell, sell, import, and otherwise transfer the Work, + where such license applies only to those patent claims licensable + by such Contributor that are necessarily infringed by their + Contribution(s) alone or by combination of their Contribution(s) + with the Work to which such Contribution(s) was submitted. If You + institute patent litigation against any entity (including a + cross-claim or counterclaim in a lawsuit) alleging that the Work + or a Contribution incorporated within the Work constitutes direct + or contributory patent infringement, then any patent licenses + granted to You under this License for that Work shall terminate + as of the date such litigation is filed. + + 4. Redistribution. 
You may reproduce and distribute copies of the + Work or Derivative Works thereof in any medium, with or without + modifications, and in Source or Object form, provided that You + meet the following conditions: + + (a) You must give any other recipients of the Work or + Derivative Works a copy of this License; and + + (b) You must cause any modified files to carry prominent notices + stating that You changed the files; and + + (c) You must retain, in the Source form of any Derivative Works + that You distribute, all copyright, patent, trademark, and + attribution notices from the Source form of the Work, + excluding those notices that do not pertain to any part of + the Derivative Works; and + + (d) If the Work includes a "NOTICE" text file as part of its + distribution, then any Derivative Works that You distribute must + include a readable copy of the attribution notices contained + within such NOTICE file, excluding those notices that do not + pertain to any part of the Derivative Works, in at least one + of the following places: within a NOTICE text file distributed + as part of the Derivative Works; within the Source form or + documentation, if provided along with the Derivative Works; or, + within a display generated by the Derivative Works, if and + wherever such third-party notices normally appear. The contents + of the NOTICE file are for informational purposes only and + do not modify the License. You may add Your own attribution + notices within Derivative Works that You distribute, alongside + or as an addendum to the NOTICE text from the Work, provided + that such additional attribution notices cannot be construed + as modifying the License. 
+ + You may add Your own copyright statement to Your modifications and + may provide additional or different license terms and conditions + for use, reproduction, or distribution of Your modifications, or + for any such Derivative Works as a whole, provided Your use, + reproduction, and distribution of the Work otherwise complies with + the conditions stated in this License. + + 5. Submission of Contributions. Unless You explicitly state otherwise, + any Contribution intentionally submitted for inclusion in the Work + by You to the Licensor shall be under the terms and conditions of + this License, without any additional terms or conditions. + Notwithstanding the above, nothing herein shall supersede or modify + the terms of any separate license agreement you may have executed + with Licensor regarding such Contributions. + + 6. Trademarks. This License does not grant permission to use the trade + names, trademarks, service marks, or product names of the Licensor, + except as required for reasonable and customary use in describing the + origin of the Work and reproducing the content of the NOTICE file. + + 7. Disclaimer of Warranty. Unless required by applicable law or + agreed to in writing, Licensor provides the Work (and each + Contributor provides its Contributions) on an "AS IS" BASIS, + WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or + implied, including, without limitation, any warranties or conditions + of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A + PARTICULAR PURPOSE. You are solely responsible for determining the + appropriateness of using or redistributing the Work and assume any + risks associated with Your exercise of permissions under this License. + + 8. Limitation of Liability. 
In no event and under no legal theory, + whether in tort (including negligence), contract, or otherwise, + unless required by applicable law (such as deliberate and grossly + negligent acts) or agreed to in writing, shall any Contributor be + liable to You for damages, including any direct, indirect, special, + incidental, or consequential damages of any character arising as a + result of this License or out of the use or inability to use the + Work (including but not limited to damages for loss of goodwill, + work stoppage, computer failure or malfunction, or any and all + other commercial damages or losses), even if such Contributor + has been advised of the possibility of such damages. + + 9. Accepting Warranty or Additional Liability. While redistributing + the Work or Derivative Works thereof, You may choose to offer, + and charge a fee for, acceptance of support, warranty, indemnity, + or other liability obligations and/or rights consistent with this + License. However, in accepting such obligations, You may act only + on Your own behalf and on Your sole responsibility, not on behalf + of any other Contributor, and only if You agree to indemnify, + defend, and hold each Contributor harmless for any liability + incurred by, or claims asserted against, such Contributor by reason + of your accepting any such warranty or additional liability. + + END OF TERMS AND CONDITIONS + + APPENDIX: How to apply the Apache License to your work. + + To apply the Apache License to your work, attach the following + boilerplate notice, with the fields enclosed by brackets "[]" + replaced with your own identifying information. (Don't include + the brackets!) The text should be enclosed in the appropriate + comment syntax for the file format. We also recommend that a + file or class name and description of purpose be included on the + same "printed page" as the copyright notice for easier + identification within third-party archives. 
+
+   Copyright [yyyy] [name of copyright owner]
+
+   Licensed under the Apache License, Version 2.0 (the "License");
+   you may not use this file except in compliance with the License.
+   You may obtain a copy of the License at
+
+       http://www.apache.org/licenses/LICENSE-2.0
+
+   Unless required by applicable law or agreed to in writing, software
+   distributed under the License is distributed on an "AS IS" BASIS,
+   WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+   See the License for the specific language governing permissions and
+   limitations under the License.
diff --git a/tools/refer/Makefile b/tools/refer/Makefile
new file mode 100644
index 0000000..7c6ea46
--- /dev/null
+++ b/tools/refer/Makefile
@@ -0,0 +1,6 @@
+all:
+	# install pycocotools/mask locally
+	# copy from https://github.com/pdollar/coco.git
+	python setup.py build_ext --inplace
+	rm -rf build
+
diff --git a/tools/refer/README.md b/tools/refer/README.md
new file mode 100644
index 0000000..f024ca6
--- /dev/null
+++ b/tools/refer/README.md
@@ -0,0 +1,55 @@
+## Note
+This API is able to load all 4 referring expression datasets, i.e., RefClef, RefCOCO, RefCOCO+ and RefCOCOg.
+They come with different train/val/test splits defined by UNC, Google and UC Berkeley, and we provide all of these splits here.
+
+
+
+
+
+
+## Citation
+If you use the three datasets RefClef, RefCOCO and RefCOCO+ collected by UNC, please consider citing our EMNLP 2014 paper; if you want to compare with our recent results, please check our ECCV 2016 paper.
+```bash
+Kazemzadeh, Sahar, et al. "ReferItGame: Referring to Objects in Photographs of Natural Scenes." EMNLP 2014.
+Yu, Licheng, et al. "Modeling Context in Referring Expressions." ECCV 2016.
+```
+
+## Setup
+Run "make" before using the code.
+It will generate ``_mask.c`` and ``_mask.so`` in the ``external/`` folder.
+This mask-related code is copied from the COCO [API](https://github.com/pdollar/coco).
+
+## Download
+Download the cleaned data and extract it into the ``data`` folder:
+- 1) http://bvisionweb1.cs.unc.edu/licheng/referit/data/refclef.zip
+- 2) http://bvisionweb1.cs.unc.edu/licheng/referit/data/refcoco.zip
+- 3) http://bvisionweb1.cs.unc.edu/licheng/referit/data/refcoco+.zip
+- 4) http://bvisionweb1.cs.unc.edu/licheng/referit/data/refcocog.zip
+
+## Prepare Images:
+Add an ``mscoco`` folder to ``data/images``; the images can be downloaded from [mscoco](http://mscoco.org/dataset/#overview).
+COCO's images are used for RefCOCO, RefCOCO+ and RefCOCOg.
+For RefCLEF, add ``saiapr_tc-12`` to the ``data/images`` folder. Our cleaned RefCLEF dataset uses 19997 related images, a subset of the original [imageCLEF](http://imageclef.org/SIAPRdata). Download the [subset](http://bvisionweb1.cs.unc.edu/licheng/referit/data/images/saiapr_tc-12.zip) and unzip it to ``data/images/saiapr_tc-12``.
+
+## How to use
+``refer.py`` is able to load all 4 datasets with the different data splits provided by UNC, Google, UMD and UC Berkeley.
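+After constructing a ``REFER`` instance with one of the calls shown below, it can be queried for referring expressions, their boxes, and visualizations. A minimal sketch of typical downstream calls follows; the method names (``getRefIds``, ``loadRefs``, ``getRefBox``, ``showRef``) are taken from ``refer.py`` and should be treated as assumptions if your copy of the API differs:
+```bash
+# sketch only: assumes `refer` was constructed as in the examples below
+ref_ids = refer.getRefIds(split='train')       # ids of referring expressions in a split
+refs = refer.loadRefs(ref_ids)                 # each ref carries sentences, image_id, ann_id
+box = refer.getRefBox(refs[0]['ref_id'])       # [x, y, w, h] box of the referred object
+refer.showRef(refs[0], seg_box='box')          # draw the expression on its image
+```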
+**Note: for RefCOCOg, we suggest using the UMD split, which provides train/val/test splits and has no image overlap between splits.**
+```bash
+# locate your own data_root, and choose the dataset_splitBy you want to use
+refer = REFER(data_root, dataset='refclef',  splitBy='unc')
+refer = REFER(data_root, dataset='refclef',  splitBy='berkeley')  # 2 train and 1 test images are missing
+refer = REFER(data_root, dataset='refcoco',  splitBy='unc')
+refer = REFER(data_root, dataset='refcoco',  splitBy='google')
+refer = REFER(data_root, dataset='refcoco+', splitBy='unc')
+refer = REFER(data_root, dataset='refcocog', splitBy='google')  # test split not released yet
+refer = REFER(data_root, dataset='refcocog', splitBy='umd')     # recommended; includes train/val/test
+```
+
+
+
diff --git a/tools/refer/__init__.py b/tools/refer/__init__.py
new file mode 100644
index 0000000..3f7d85b
--- /dev/null
+++ b/tools/refer/__init__.py
@@ -0,0 +1 @@
+__author__ = 'tylin'
diff --git a/tools/refer/data/README.md b/tools/refer/data/README.md
new file mode 100644
index 0000000..2684424
--- /dev/null
+++ b/tools/refer/data/README.md
@@ -0,0 +1,37 @@
+This directory should contain the following data:
+```
+$DATA_PATH
+├── images
+│   ├── mscoco
+│   └── saiaprtc12
+├── refcoco
+│   ├── instances.json
+│   ├── refs(google).p
+│   └── refs(unc).p
+├── refcoco+
+│   ├── instances.json
+│   └── refs(unc).p
+├── refcocog
+│   ├── instances.json
+│   └── refs(google).p
+└── refclef
+    ├── instances.json
+    ├── refs(unc).p
+    └── refs(berkeley).p
+```
+
+Note: each ``detections/xxx.json`` contains
+``{'dets': ['box': {x, y, w, h}, 'image_id', 'object_id', 'score']}``. The ``object_id`` and ``score`` fields may be missing, depending on which proposal/detection technique is used.
+
+## Download
+Download my cleaned data and extract it into this folder:
+
+- 1) http://bvisionweb1.cs.unc.edu/licheng/referit/data/refclef.zip
+- 2) http://bvisionweb1.cs.unc.edu/licheng/referit/data/refcoco.zip
+- 3) http://bvisionweb1.cs.unc.edu/licheng/referit/data/refcoco+.zip
+- 4) http://bvisionweb1.cs.unc.edu/licheng/referit/data/refcocog.zip
+
+Besides, make a folder named ``images``.
+Add ``mscoco`` into ``images/``; download MSCOCO from [mscoco](http://mscoco.org/dataset/#overview).
+
+Add ``saiapr_tc-12`` into ``images/``. I extracted only the related images, a 19997-image subset of the original [imageCLEF](http://imageclef.org/SIAPRdata). Please download the subset from http://bvisionweb1.cs.unc.edu/licheng/referit/data/images/saiapr_tc-12.zip.
diff --git a/tools/refer/evaluation/__init__.py b/tools/refer/evaluation/__init__.py
new file mode 100644
index 0000000..66e55e9
--- /dev/null
+++ b/tools/refer/evaluation/__init__.py
@@ -0,0 +1,3 @@
+__author__ = 'licheng'
+
+
diff --git a/tools/refer/evaluation/bleu/LICENSE b/tools/refer/evaluation/bleu/LICENSE
new file mode 100644
index 0000000..9ccf677
--- /dev/null
+++ b/tools/refer/evaluation/bleu/LICENSE
@@ -0,0 +1,19 @@
+Copyright (c) 2015 Xinlei Chen, Hao Fang, Tsung-Yi Lin, and Ramakrishna Vedantam
+
+Permission is hereby granted, free of charge, to any person obtaining a copy
+of this software and associated documentation files (the "Software"), to deal
+in the Software without restriction, including without limitation the rights
+to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
+copies of the Software, and to permit persons to whom the Software is
+furnished to do so, subject to the following conditions:
+
+The above copyright notice and this permission notice shall be included in
+all copies or substantial portions of the Software.
+
+THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
+IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
+FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT.
IN NO EVENT SHALL THE
+AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
+LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
+OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
+THE SOFTWARE.
diff --git a/tools/refer/evaluation/bleu/__init__.py b/tools/refer/evaluation/bleu/__init__.py
new file mode 100644
index 0000000..3f7d85b
--- /dev/null
+++ b/tools/refer/evaluation/bleu/__init__.py
@@ -0,0 +1 @@
+__author__ = 'tylin'
diff --git a/tools/refer/evaluation/bleu/bleu.py b/tools/refer/evaluation/bleu/bleu.py
new file mode 100644
index 0000000..b0da5dd
--- /dev/null
+++ b/tools/refer/evaluation/bleu/bleu.py
@@ -0,0 +1,47 @@
+#!/usr/bin/env python
+#
+# File Name : bleu.py
+#
+# Description : Wrapper for BLEU scorer.
+#
+# Creation Date : 06-01-2015
+# Last Modified : Thu 19 Mar 2015 09:13:28 PM PDT
+# Authors : Hao Fang and Tsung-Yi Lin
+
+from .bleu_scorer import BleuScorer
+
+
+class Bleu:
+    def __init__(self, n=4):
+        # by default, compute BLEU score up to 4-grams
+        self._n = n
+        self._hypo_for_image = {}
+        self.ref_for_image = {}
+
+    def compute_score(self, gts, res):
+
+        assert(gts.keys() == res.keys())
+        imgIds = gts.keys()
+
+        bleu_scorer = BleuScorer(n=self._n)
+        for id in imgIds:
+            hypo = res[id]
+            ref = gts[id]
+
+            # Sanity check.
+ assert(type(hypo) is list) + assert(len(hypo) == 1) + assert(type(ref) is list) + assert(len(ref) >= 1) + + bleu_scorer += (hypo[0], ref) + + #score, scores = bleu_scorer.compute_score(option='shortest') + score, scores = bleu_scorer.compute_score(option='closest', verbose=1) + #score, scores = bleu_scorer.compute_score(option='average', verbose=1) + + # return (bleu, bleu_info) + return score, scores + + def method(self): + return "Bleu" diff --git a/tools/refer/evaluation/bleu/bleu_scorer.py b/tools/refer/evaluation/bleu/bleu_scorer.py new file mode 100644 index 0000000..3685e05 --- /dev/null +++ b/tools/refer/evaluation/bleu/bleu_scorer.py @@ -0,0 +1,263 @@ +#!/usr/bin/env python + +# bleu_scorer.py +# David Chiang + +# Copyright (c) 2004-2006 University of Maryland. All rights +# reserved. Do not redistribute without permission from the +# author. Not for commercial use. + +# Modified by: +# Hao Fang +# Tsung-Yi Lin + +'''Provides: +cook_refs(refs, n=4): Transform a list of reference sentences as strings into a form usable by cook_test(). +cook_test(test, refs, n=4): Transform a test sentence as a string (together with the cooked reference sentences) into a form usable by score_cooked(). +''' + +import copy +import sys, math, re +from collections import defaultdict + +def precook(s, n=4, out=False): + """Takes a string as input and returns an object that can be given to + either cook_refs or cook_test. 
This is optional: cook_refs and cook_test
+    can take string arguments as well."""
+    words = s.split()
+    counts = defaultdict(int)
+    for k in range(1, n+1):
+        for i in range(len(words)-k+1):
+            ngram = tuple(words[i:i+k])
+            counts[ngram] += 1
+    return (len(words), counts)
+
+def cook_refs(refs, eff=None, n=4): ## lhuang: oracle will call with "average"
+    '''Takes a list of reference sentences for a single segment
+    and returns an object that encapsulates everything that BLEU
+    needs to know about them.'''
+
+    reflen = []
+    maxcounts = {}
+    for ref in refs:
+        rl, counts = precook(ref, n)
+        reflen.append(rl)
+        for (ngram, count) in counts.items():
+            maxcounts[ngram] = max(maxcounts.get(ngram, 0), count)
+
+    # Calculate effective reference sentence length.
+    if eff == "shortest":
+        reflen = min(reflen)
+    elif eff == "average":
+        reflen = float(sum(reflen))/len(reflen)
+
+    ## lhuang: N.B.: leave reflen computation to the very end!!
+
+    ## lhuang: N.B.: in case of "closest", keep a list of reflens!! (bad design)
+
+    return (reflen, maxcounts)
+
+def cook_test(test, cooked_refs, eff=None, n=4):
+    '''Takes a test sentence and returns an object that
+    encapsulates everything that BLEU needs to know about it.'''
+
+    # tuple parameters were removed in Python 3, so unpack explicitly
+    reflen, refmaxcounts = cooked_refs
+    testlen, counts = precook(test, n, True)
+
+    result = {}
+
+    # Calculate effective reference sentence length.
+
+    if eff == "closest":
+        result["reflen"] = min((abs(l-testlen), l) for l in reflen)[1]
+    else: ## i.e., "average" or "shortest" or None
+        result["reflen"] = reflen
+
+    result["testlen"] = testlen
+
+    result["guess"] = [max(0, testlen-k+1) for k in range(1, n+1)]
+
+    result['correct'] = [0]*n
+    for (ngram, count) in counts.items():
+        result["correct"][len(ngram)-1] += min(refmaxcounts.get(ngram, 0), count)
+
+    return result
+
+class BleuScorer(object):
+    """Bleu scorer.
+ """ + + __slots__ = "n", "crefs", "ctest", "_score", "_ratio", "_testlen", "_reflen", "special_reflen" + # special_reflen is used in oracle (proportional effective ref len for a node). + + def copy(self): + ''' copy the refs.''' + new = BleuScorer(n=self.n) + new.ctest = copy.copy(self.ctest) + new.crefs = copy.copy(self.crefs) + new._score = None + return new + + def __init__(self, test=None, refs=None, n=4, special_reflen=None): + ''' singular instance ''' + + self.n = n + self.crefs = [] + self.ctest = [] + self.cook_append(test, refs) + self.special_reflen = special_reflen + + def cook_append(self, test, refs): + '''called by constructor and __iadd__ to avoid creating new instances.''' + + if refs is not None: + self.crefs.append(cook_refs(refs)) + if test is not None: + cooked_test = cook_test(test, self.crefs[-1]) + self.ctest.append(cooked_test) ## N.B.: -1 + else: + self.ctest.append(None) # lens of crefs and ctest have to match + + self._score = None ## need to recompute + + def ratio(self, option=None): + self.compute_score(option=option) + return self._ratio + + def score_ratio(self, option=None): + '''return (bleu, len_ratio) pair''' + return (self.fscore(option=option), self.ratio(option=option)) + + def score_ratio_str(self, option=None): + return "%.4f (%.2f)" % self.score_ratio(option) + + def reflen(self, option=None): + self.compute_score(option=option) + return self._reflen + + def testlen(self, option=None): + self.compute_score(option=option) + return self._testlen + + def retest(self, new_test): + if type(new_test) is str: + new_test = [new_test] + assert len(new_test) == len(self.crefs), new_test + self.ctest = [] + for t, rs in zip(new_test, self.crefs): + self.ctest.append(cook_test(t, rs)) + self._score = None + + return self + + def rescore(self, new_test): + ''' replace test(s) with new test(s), and returns the new score.''' + + return self.retest(new_test).compute_score() + + def size(self): + assert len(self.crefs) == 
len(self.ctest), "refs/test mismatch! %d<>%d" % (len(self.crefs), len(self.ctest))
+        return len(self.crefs)
+
+    def __iadd__(self, other):
+        '''add an instance (e.g., from another sentence).'''
+
+        if type(other) is tuple:
+            ## avoid creating new BleuScorer instances
+            self.cook_append(other[0], other[1])
+        else:
+            assert self.compatible(other), "incompatible BLEUs."
+            self.ctest.extend(other.ctest)
+            self.crefs.extend(other.crefs)
+            self._score = None ## need to recompute
+
+        return self
+
+    def compatible(self, other):
+        return isinstance(other, BleuScorer) and self.n == other.n
+
+    def single_reflen(self, option="average"):
+        return self._single_reflen(self.crefs[0][0], option)
+
+    def _single_reflen(self, reflens, option=None, testlen=None):
+
+        if option == "shortest":
+            reflen = min(reflens)
+        elif option == "average":
+            reflen = float(sum(reflens))/len(reflens)
+        elif option == "closest":
+            reflen = min((abs(l-testlen), l) for l in reflens)[1]
+        else:
+            assert False, "unsupported reflen option %s" % option
+
+        return reflen
+
+    def recompute_score(self, option=None, verbose=0):
+        self._score = None
+        return self.compute_score(option, verbose)
+
+    def compute_score(self, option=None, verbose=0):
+        n = self.n
+        small = 1e-9
+        tiny = 1e-15 ## so that if guess is 0 still return 0
+        bleu_list = [[] for _ in range(n)]
+
+        if self._score is not None:
+            return self._score
+
+        if option is None:
+            option = "average" if len(self.crefs) == 1 else "closest"
+
+        self._testlen = 0
+        self._reflen = 0
+        totalcomps = {'testlen':0, 'reflen':0, 'guess':[0]*n, 'correct':[0]*n}
+
+        # for each sentence
+        for comps in self.ctest:
+            testlen = comps['testlen']
+            self._testlen += testlen
+
+            if self.special_reflen is None: ## need computation
+                reflen = self._single_reflen(comps['reflen'], option, testlen)
+            else:
+                reflen = self.special_reflen
+
+            self._reflen += reflen
+
+            for key in ['guess', 'correct']:
+                for k in range(n):
+                    totalcomps[key][k] += comps[key][k]
+
+            # append per
image bleu score
+            bleu = 1.
+            for k in range(n):
+                bleu *= (float(comps['correct'][k]) + tiny) \
+                        /(float(comps['guess'][k]) + small)
+                bleu_list[k].append(bleu ** (1./(k+1)))
+            ratio = (testlen + tiny) / (reflen + small) ## N.B.: avoid zero division
+            if ratio < 1:
+                for k in range(n):
+                    bleu_list[k][-1] *= math.exp(1 - 1/ratio)
+
+            if verbose > 1:
+                print(comps, reflen)
+
+        totalcomps['reflen'] = self._reflen
+        totalcomps['testlen'] = self._testlen
+
+        bleus = []
+        bleu = 1.
+        for k in range(n):
+            bleu *= float(totalcomps['correct'][k] + tiny) \
+                    / (totalcomps['guess'][k] + small)
+            bleus.append(bleu ** (1./(k+1)))
+        ratio = (self._testlen + tiny) / (self._reflen + small) ## N.B.: avoid zero division
+        if ratio < 1:
+            for k in range(n):
+                bleus[k] *= math.exp(1 - 1/ratio)
+
+        if verbose > 0:
+            print(totalcomps)
+            print("ratio:", ratio)
+
+        self._score = bleus
+        return self._score, bleu_list
diff --git a/tools/refer/evaluation/cider/__init__.py b/tools/refer/evaluation/cider/__init__.py
new file mode 100644
index 0000000..3f7d85b
--- /dev/null
+++ b/tools/refer/evaluation/cider/__init__.py
@@ -0,0 +1 @@
+__author__ = 'tylin'
diff --git a/tools/refer/evaluation/cider/cider.py b/tools/refer/evaluation/cider/cider.py
new file mode 100644
index 0000000..d0b99ee
--- /dev/null
+++ b/tools/refer/evaluation/cider/cider.py
@@ -0,0 +1,54 @@
+# Filename: cider.py
+#
+# Description: Describes the class to compute the CIDEr (Consensus-Based Image Description Evaluation) Metric
+#              by Vedantam, Zitnick, and Parikh (http://arxiv.org/abs/1411.5726)
+#
+# Creation Date: Sun Feb 8 14:16:54 2015
+#
+# Authors: Ramakrishna Vedantam and Tsung-Yi Lin
+
+from .cider_scorer import CiderScorer
+
+class Cider:
+    """
+    Main Class to compute the CIDEr metric
+
+    """
+    def __init__(self, test=None, refs=None, n=4, sigma=6.0):
+        # set cider to sum over 1 to 4-grams
+        self._n = n
+        # set the standard deviation parameter for gaussian penalty
+        self._sigma = sigma
+
+    def
compute_score(self, gts, res): + """ + Main function to compute CIDEr score + :param hypo_for_image (dict) : dictionary with key <image> and value <tokenized hypotheses / candidate sentences> + ref_for_image (dict) : dictionary with key <image> and value <tokenized reference sentences> + :return: cider (float) : computed CIDEr score for the corpus + """ + + assert(gts.keys() == res.keys()) + imgIds = gts.keys() + + cider_scorer = CiderScorer(n=self._n, sigma=self._sigma) + + for id in imgIds: + hypo = res[id] + ref = gts[id] + + # Sanity check. + assert(type(hypo) is list) + assert(len(hypo) == 1) + assert(type(ref) is list) + assert(len(ref) > 0) + + cider_scorer += (hypo[0], ref) + + (score, scores) = cider_scorer.compute_score() + + return score, scores + + def method(self): + return "CIDEr" \ No newline at end of file diff --git a/tools/refer/evaluation/cider/cider_scorer.py b/tools/refer/evaluation/cider/cider_scorer.py new file mode 100644 index 0000000..a73405e --- /dev/null +++ b/tools/refer/evaluation/cider/cider_scorer.py @@ -0,0 +1,192 @@ +#!/usr/bin/env python +# Tsung-Yi Lin +# Ramakrishna Vedantam + +import copy +from collections import defaultdict +import numpy as np +import pdb +import math + +def precook(s, n=4, out=False): + """ + Takes a string as input and returns an object that can be given to + either cook_refs or cook_test. This is optional: cook_refs and cook_test + can take string arguments as well. + :param s: string : sentence to be converted into ngrams + :param n: int : number of ngrams for which representation is calculated + :return: term frequency vector for occurring ngrams + """ + words = s.split() + counts = defaultdict(int) + for k in xrange(1,n+1): + for i in xrange(len(words)-k+1): + ngram = tuple(words[i:i+k]) + counts[ngram] += 1 + return counts + +def cook_refs(refs, n=4): ## lhuang: oracle will call with "average" + '''Takes a list of reference sentences for a single segment + and returns an object that encapsulates everything that BLEU + needs to know about them. 
+ :param refs: list of string : reference sentences for some image + :param n: int : number of ngrams for which (ngram) representation is calculated + :return: result (list of dict) + ''' + return [precook(ref, n) for ref in refs] + +def cook_test(test, n=4): + '''Takes a test sentence and returns an object that + encapsulates everything that BLEU needs to know about it. + :param test: list of string : hypothesis sentence for some image + :param n: int : number of ngrams for which (ngram) representation is calculated + :return: result (dict) + ''' + return precook(test, n, True) + +class CiderScorer(object): + """CIDEr scorer. + """ + + def copy(self): + ''' copy the refs.''' + new = CiderScorer(n=self.n) + new.ctest = copy.copy(self.ctest) + new.crefs = copy.copy(self.crefs) + return new + + def __init__(self, test=None, refs=None, n=4, sigma=6.0): + ''' singular instance ''' + self.n = n + self.sigma = sigma + self.crefs = [] + self.ctest = [] + self.document_frequency = defaultdict(float) + self.cook_append(test, refs) + self.ref_len = None + + def cook_append(self, test, refs): + '''called by constructor and __iadd__ to avoid creating new instances.''' + + if refs is not None: + self.crefs.append(cook_refs(refs)) + if test is not None: + self.ctest.append(cook_test(test)) ## N.B.: -1 + else: + self.ctest.append(None) # lens of crefs and ctest have to match + + def size(self): + assert len(self.crefs) == len(self.ctest), "refs/test mismatch! %d<>%d" % (len(self.crefs), len(self.ctest)) + return len(self.crefs) + + def __iadd__(self, other): + '''add an instance (e.g., from another sentence).''' + + if type(other) is tuple: + ## avoid creating new CiderScorer instances + self.cook_append(other[0], other[1]) + else: + self.ctest.extend(other.ctest) + self.crefs.extend(other.crefs) + + return self + def compute_doc_freq(self): + ''' + Compute term frequency for reference data. 
+ This will be used to compute idf (inverse document frequency) later + The term frequency is stored in the object + :return: None + ''' + for refs in self.crefs: + # refs, k ref captions of one image + for ngram in set([ngram for ref in refs for (ngram,count) in ref.iteritems()]): + self.document_frequency[ngram] += 1 + # maxcounts[ngram] = max(maxcounts.get(ngram,0), count) + + def compute_cider(self): + def counts2vec(cnts): + """ + Function maps counts of ngram to vector of tfidf weights. + The function returns vec, an array of dictionaries that stores the mapping of n-gram to tf-idf weights. + The n-th entry of array denotes length of n-grams. + :param cnts: + :return: vec (array of dict), norm (array of float), length (int) + """ + vec = [defaultdict(float) for _ in range(self.n)] + length = 0 + norm = [0.0 for _ in range(self.n)] + for (ngram,term_freq) in cnts.iteritems(): + # give word count 1 if it doesn't appear in reference corpus + df = np.log(max(1.0, self.document_frequency[ngram])) + # ngram index + n = len(ngram)-1 + # tf (term_freq) * idf (precomputed idf) for n-grams + vec[n][ngram] = float(term_freq)*(self.ref_len - df) + # compute norm for the vector. the norm will be used for computing similarity + norm[n] += pow(vec[n][ngram], 2) + + if n == 1: + length += term_freq + norm = [np.sqrt(n) for n in norm] + return vec, norm, length + + def sim(vec_hyp, vec_ref, norm_hyp, norm_ref, length_hyp, length_ref): + ''' + Compute the cosine similarity of two vectors. 
+ :param vec_hyp: array of dictionary for vector corresponding to hypothesis + :param vec_ref: array of dictionary for vector corresponding to reference + :param norm_hyp: array of float for vector corresponding to hypothesis + :param norm_ref: array of float for vector corresponding to reference + :param length_hyp: int containing length of hypothesis + :param length_ref: int containing length of reference + :return: array of score for each n-grams cosine similarity + ''' + delta = float(length_hyp - length_ref) + # measure cosine similarity + val = np.array([0.0 for _ in range(self.n)]) + for n in range(self.n): + # ngram + for (ngram,count) in vec_hyp[n].iteritems(): + # vrama91 : added clipping + val[n] += min(vec_hyp[n][ngram], vec_ref[n][ngram]) * vec_ref[n][ngram] + + if (norm_hyp[n] != 0) and (norm_ref[n] != 0): + val[n] /= (norm_hyp[n]*norm_ref[n]) + + assert(not math.isnan(val[n])) + # vrama91: added a length based gaussian penalty + val[n] *= np.e**(-(delta**2)/(2*self.sigma**2)) + return val + + # compute log reference length + self.ref_len = np.log(float(len(self.crefs))) + + scores = [] + for test, refs in zip(self.ctest, self.crefs): + # compute vector for test captions + vec, norm, length = counts2vec(test) + # compute vector for ref captions + score = np.array([0.0 for _ in range(self.n)]) + for ref in refs: + vec_ref, norm_ref, length_ref = counts2vec(ref) + score += sim(vec, vec_ref, norm, norm_ref, length, length_ref) + # change by vrama91 - mean of ngram scores, instead of sum + score_avg = np.mean(score) + # divide by number of references + score_avg /= len(refs) + # multiply score by 10 + score_avg *= 10.0 + # append score of an image to the score list + scores.append(score_avg) + return scores + + def compute_score(self, option=None, verbose=0): + # compute idf + self.compute_doc_freq() + # assert to check document frequency + assert(len(self.ctest) >= max(self.document_frequency.values())) + # compute cider score + score = 
self.compute_cider() + # debug + # print score + return np.mean(np.array(score)), np.array(score) \ No newline at end of file diff --git a/tools/refer/evaluation/meteor/__init__.py b/tools/refer/evaluation/meteor/__init__.py new file mode 100644 index 0000000..3f7d85b --- /dev/null +++ b/tools/refer/evaluation/meteor/__init__.py @@ -0,0 +1 @@ +__author__ = 'tylin' diff --git a/tools/refer/evaluation/meteor/data/paraphrase-en.gz b/tools/refer/evaluation/meteor/data/paraphrase-en.gz new file mode 100644 index 0000000..88033c8 Binary files /dev/null and b/tools/refer/evaluation/meteor/data/paraphrase-en.gz differ diff --git a/tools/refer/evaluation/meteor/meteor-1.5.jar b/tools/refer/evaluation/meteor/meteor-1.5.jar new file mode 100644 index 0000000..a833bc0 Binary files /dev/null and b/tools/refer/evaluation/meteor/meteor-1.5.jar differ diff --git a/tools/refer/evaluation/meteor/meteor.py b/tools/refer/evaluation/meteor/meteor.py new file mode 100644 index 0000000..6472948 --- /dev/null +++ b/tools/refer/evaluation/meteor/meteor.py @@ -0,0 +1,76 @@ +#!/usr/bin/env python + +# Python wrapper for METEOR implementation, by Xinlei Chen +# Acknowledge Michael Denkowski for the generous discussion and help + +import os +import sys +import subprocess +import threading + +# Assumes meteor-1.5.jar is in the same directory as meteor.py. Change as needed. 
+METEOR_JAR = 'meteor-1.5.jar' +# print METEOR_JAR + +class Meteor: + + def __init__(self): + self.meteor_cmd = ['java', '-jar', '-Xmx2G', METEOR_JAR, \ + '-', '-', '-stdio', '-l', 'en', '-norm'] + self.meteor_p = subprocess.Popen(self.meteor_cmd, \ + cwd=os.path.dirname(os.path.abspath(__file__)), \ + stdin=subprocess.PIPE, \ + stdout=subprocess.PIPE, \ + stderr=subprocess.PIPE) + # Used to guarantee thread safety + self.lock = threading.Lock() + + def compute_score(self, gts, res): + assert(gts.keys() == res.keys()) + imgIds = gts.keys() + scores = [] + + eval_line = 'EVAL' + self.lock.acquire() + for i in imgIds: + assert(len(res[i]) == 1) + stat = self._stat(res[i][0], gts[i]) + eval_line += ' ||| {}'.format(stat) + + self.meteor_p.stdin.write('{}\n'.format(eval_line)) + for i in range(0,len(imgIds)): + scores.append(float(self.meteor_p.stdout.readline().strip())) + score = float(self.meteor_p.stdout.readline().strip()) + self.lock.release() + + return score, scores + + def method(self): + return "METEOR" + + def _stat(self, hypothesis_str, reference_list): + # SCORE ||| reference 1 words ||| reference n words ||| hypothesis words + hypothesis_str = hypothesis_str.replace('|||','').replace('  ',' ') + score_line = ' ||| '.join(('SCORE', ' ||| '.join(reference_list), hypothesis_str)) + self.meteor_p.stdin.write('{}\n'.format(score_line)) + return self.meteor_p.stdout.readline().strip() + + def _score(self, hypothesis_str, reference_list): + self.lock.acquire() + # SCORE ||| reference 1 words ||| reference n words ||| hypothesis words + hypothesis_str = hypothesis_str.replace('|||','').replace('  ',' ') + score_line = ' ||| '.join(('SCORE', ' ||| '.join(reference_list), hypothesis_str)) + self.meteor_p.stdin.write('{}\n'.format(score_line)) + stats = self.meteor_p.stdout.readline().strip() + eval_line = 'EVAL ||| {}'.format(stats) + # EVAL ||| stats + self.meteor_p.stdin.write('{}\n'.format(eval_line)) + score = float(self.meteor_p.stdout.readline().strip()) + 
self.lock.release() + return score + + def __exit__(self): + self.lock.acquire() + self.meteor_p.stdin.close() + self.meteor_p.wait() + self.lock.release() diff --git a/tools/refer/evaluation/readme.txt b/tools/refer/evaluation/readme.txt new file mode 100644 index 0000000..4491b5a --- /dev/null +++ b/tools/refer/evaluation/readme.txt @@ -0,0 +1,11 @@ +This folder contains modified coco-caption evaluation, which is downloaded from https://github.com/tylin/coco-caption.git +and refEvaluation which is to be called by the refer algorithm. + +More specifically, this folder contains: +1. bleu/ +2. cider/ +3. meteor/ +4. rouge/ +5. tokenizer/ +6. __init__.py +7. refEvaluation.py diff --git a/tools/refer/evaluation/refEvaluation.py b/tools/refer/evaluation/refEvaluation.py new file mode 100644 index 0000000..bfa11ed --- /dev/null +++ b/tools/refer/evaluation/refEvaluation.py @@ -0,0 +1,136 @@ +from tokenizer.ptbtokenizer import PTBTokenizer +from bleu.bleu import Bleu +from meteor.meteor import Meteor +from rouge.rouge import Rouge +from cider.cider import Cider + +""" +Input: refer and Res = [{ref_id, sent}] + +Things of interest +evalRefs - list of ['ref_id', 'CIDEr', 'Bleu_1', 'Bleu_2', 'Bleu_3', 'Bleu_4', 'ROUGE_L', 'METEOR'] +eval - dict of {metric: score} +refToEval - dict of {ref_id: ['ref_id', 'CIDEr', 'Bleu_1', 'Bleu_2', 'Bleu_3', 'Bleu_4', 'ROUGE_L', 'METEOR']} +""" + +class RefEvaluation: + def __init__ (self, refer, Res): + """ + :param refer: refer class of current dataset + :param Res: [{'ref_id', 'sent'}] + """ + self.evalRefs = [] + self.eval = {} + self.refToEval = {} + self.refer = refer + self.Res = Res + + def evaluate(self): + + evalRefIds = [ann['ref_id'] for ann in self.Res] + + refToGts = {} + for ref_id in evalRefIds: + ref = self.refer.Refs[ref_id] + gt_sents = [sent['sent'].encode('ascii', 'ignore').decode('ascii') for sent in ref['sentences']] # up to 3 expressions + refToGts[ref_id] = gt_sents + refToRes = {ann['ref_id']: [ann['sent']] for ann 
in self.Res} + + print 'tokenization...' + tokenizer = PTBTokenizer() + self.refToRes = tokenizer.tokenize(refToRes) + self.refToGts = tokenizer.tokenize(refToGts) + + # ================================================= + # Set up scorers + # ================================================= + print 'setting up scorers...' + scorers = [ + (Bleu(4), ["Bleu_1", "Bleu_2", "Bleu_3", "Bleu_4"]), + (Meteor(),"METEOR"), + (Rouge(), "ROUGE_L"), + (Cider(), "CIDEr") + ] + + # ================================================= + # Compute scores + # ================================================= + for scorer, method in scorers: + print 'computing %s score...'%(scorer.method()) + score, scores = scorer.compute_score(self.refToGts, self.refToRes) + if type(method) == list: + for sc, scs, m in zip(score, scores, method): + self.setEval(sc, m) + self.setRefToEvalRefs(scs, self.refToGts.keys(), m) + print "%s: %0.3f"%(m, sc) + else: + self.setEval(score, method) + self.setRefToEvalRefs(scores, self.refToGts.keys(), method) + print "%s: %0.3f"%(method, score) + self.setEvalRefs() + + def setEval(self, score, method): + self.eval[method] = score + + def setRefToEvalRefs(self, scores, refIds, method): + for refId, score in zip(refIds, scores): + if not refId in self.refToEval: + self.refToEval[refId] = {} + self.refToEval[refId]["ref_id"] = refId + self.refToEval[refId][method] = score + + def setEvalRefs(self): + self.evalRefs = [eval for refId, eval in self.refToEval.items()] + + +if __name__ == '__main__': + + import os.path as osp + import sys + ROOT_DIR = osp.abspath(osp.join(osp.dirname(__file__), '..', '..')) + sys.path.insert(0, osp.join(ROOT_DIR, 'lib', 'datasets')) + from refer import REFER + + # load refer of dataset + dataset = 'refcoco' + refer = REFER(dataset, splitBy = 'google') + + # mimic some Res + val_refIds = refer.getRefIds(split='test') + ref_id = 49767 + print "GD: %s" % refer.Refs[ref_id]['sentences'] + Res = [{'ref_id': ref_id, 'sent': 'left bottle'}] + + 
# evaluate some refer expressions + refEval = RefEvaluation(refer, Res) + refEval.evaluate() + + # print output evaluation scores + for metric, score in refEval.eval.items(): + print '%s: %.3f'%(metric, score) + + # demo how to use evalImgs to retrieve low score result + # evals = [eva for eva in refEval.evalRefs if eva['CIDEr']<30] + # print 'ground truth sents' + # refId = evals[0]['ref_id'] + # print 'refId: %s' % refId + # print [sent['sent'] for sent in refer.Refs[refId]['sentences']] + # + # print 'generated sent (CIDEr score %0.1f)' % (evals[0]['CIDEr']) + + # print refEval.refToEval[8] + + + + + + + + + + + + + + + diff --git a/tools/refer/evaluation/rouge/__init__.py b/tools/refer/evaluation/rouge/__init__.py new file mode 100644 index 0000000..43a773e --- /dev/null +++ b/tools/refer/evaluation/rouge/__init__.py @@ -0,0 +1 @@ +__author__ = 'vrama91' diff --git a/tools/refer/evaluation/rouge/rouge.py b/tools/refer/evaluation/rouge/rouge.py new file mode 100644 index 0000000..3a10f5a --- /dev/null +++ b/tools/refer/evaluation/rouge/rouge.py @@ -0,0 +1,105 @@ +#!/usr/bin/env python +# +# File Name : rouge.py +# +# Description : Computes ROUGE-L metric as described by Lin and Hovey (2004) +# +# Creation Date : 2015-01-07 06:03 +# Author : Ramakrishna Vedantam + +import numpy as np +import pdb + +def my_lcs(string, sub): + """ + Calculates longest common subsequence for a pair of tokenized strings + :param string : list of str : tokens from a string split using whitespace + :param sub : list of str : shorter string, also split using whitespace + :returns: length (list of int): length of the longest common subsequence between the two strings + + Note: my_lcs only gives length of the longest common subsequence, not the actual LCS + """ + if(len(string)< len(sub)): + sub, string = string, sub + + lengths = [[0 for i in range(0,len(sub)+1)] for j in range(0,len(string)+1)] + + for j in range(1,len(sub)+1): + for i in range(1,len(string)+1): + if(string[i-1] == 
sub[j-1]): + lengths[i][j] = lengths[i-1][j-1] + 1 + else: + lengths[i][j] = max(lengths[i-1][j] , lengths[i][j-1]) + + return lengths[len(string)][len(sub)] + +class Rouge(): + ''' + Class for computing ROUGE-L score for a set of candidate sentences for the MS COCO test set + + ''' + def __init__(self): + # vrama91: updated the value below based on discussion with Hovey + self.beta = 1.2 + + def calc_score(self, candidate, refs): + """ + Compute ROUGE-L score given one candidate and references for an image + :param candidate: str : candidate sentence to be evaluated + :param refs: list of str : COCO reference sentences for the particular image to be evaluated + :returns score: int (ROUGE-L score for the candidate evaluated against references) + """ + assert(len(candidate)==1) + assert(len(refs)>0) + prec = [] + rec = [] + + # split into tokens + token_c = candidate[0].split(" ") + + for reference in refs: + # split into tokens + token_r = reference.split(" ") + # compute the longest common subsequence + lcs = my_lcs(token_r, token_c) + prec.append(lcs/float(len(token_c))) + rec.append(lcs/float(len(token_r))) + + prec_max = max(prec) + rec_max = max(rec) + + if(prec_max!=0 and rec_max !=0): + score = ((1 + self.beta**2)*prec_max*rec_max)/float(rec_max + self.beta**2*prec_max) + else: + score = 0.0 + return score + + def compute_score(self, gts, res): + """ + Computes Rouge-L score given a set of reference and candidate sentences for the dataset + Invoked by evaluate_captions.py + :param hypo_for_image: dict : candidate / test sentences with "image name" key and "tokenized sentences" as values + :param ref_for_image: dict : reference MS-COCO sentences with "image name" key and "tokenized sentences" as values + :returns: average_score: float (mean ROUGE-L score computed by averaging scores for all the images) + """ + assert(gts.keys() == res.keys()) + imgIds = gts.keys() + + score = [] + for id in imgIds: + hypo = res[id] + ref = gts[id] + + 
score.append(self.calc_score(hypo, ref)) + + # Sanity check. + assert(type(hypo) is list) + assert(len(hypo) == 1) + assert(type(ref) is list) + assert(len(ref) > 0) + + average_score = np.mean(np.array(score)) + return average_score, np.array(score) + + def method(self): + return "Rouge" diff --git a/tools/refer/evaluation/tokenizer/__init__.py b/tools/refer/evaluation/tokenizer/__init__.py new file mode 100644 index 0000000..71357a4 --- /dev/null +++ b/tools/refer/evaluation/tokenizer/__init__.py @@ -0,0 +1 @@ +__author__ = 'hfang' diff --git a/tools/refer/evaluation/tokenizer/ptbtokenizer.py b/tools/refer/evaluation/tokenizer/ptbtokenizer.py new file mode 100644 index 0000000..346ebe7 --- /dev/null +++ b/tools/refer/evaluation/tokenizer/ptbtokenizer.py @@ -0,0 +1,68 @@ +#!/usr/bin/env python +# +# File Name : ptbtokenizer.py +# +# Description : Do the PTB Tokenization and remove punctuations. +# +# Creation Date : 29-12-2014 +# Last Modified : Thu Mar 19 09:53:35 2015 +# Authors : Hao Fang and Tsung-Yi Lin + +import os +import sys +import subprocess +import tempfile +import itertools + +# path to the stanford corenlp jar +STANFORD_CORENLP_3_4_1_JAR = 'stanford-corenlp-3.4.1.jar' + +# punctuations to be removed from the sentences +PUNCTUATIONS = ["''", "'", "``", "`", "-LRB-", "-RRB-", "-LCB-", "-RCB-", \ + ".", "?", "!", ",", ":", "-", "--", "...", ";"] + +class PTBTokenizer: + """Python wrapper of Stanford PTBTokenizer""" + + def tokenize(self, captions_for_image): + cmd = ['java', '-cp', STANFORD_CORENLP_3_4_1_JAR, \ + 'edu.stanford.nlp.process.PTBTokenizer', \ + '-preserveLines', '-lowerCase'] + + # ====================================================== + # prepare data for PTB Tokenizer + # ====================================================== + final_tokenized_captions_for_image = {} + image_id = [k for k, v in captions_for_image.items() for _ in range(len(v))] + sentences = '\n'.join([c.replace('\n', ' ') for k, v in captions_for_image.items() for c in 
v]) + + # ====================================================== + # save sentences to temporary file + # ====================================================== + path_to_jar_dirname=os.path.dirname(os.path.abspath(__file__)) + tmp_file = tempfile.NamedTemporaryFile(delete=False, dir=path_to_jar_dirname) + tmp_file.write(sentences) + tmp_file.close() + + # ====================================================== + # tokenize sentence + # ====================================================== + cmd.append(os.path.basename(tmp_file.name)) + p_tokenizer = subprocess.Popen(cmd, cwd=path_to_jar_dirname, \ + stdout=subprocess.PIPE) + token_lines = p_tokenizer.communicate(input=sentences.rstrip())[0] + lines = token_lines.split('\n') + # remove temp file + os.remove(tmp_file.name) + + # ====================================================== + # create dictionary for tokenized captions + # ====================================================== + for k, line in zip(image_id, lines): + if not k in final_tokenized_captions_for_image: + final_tokenized_captions_for_image[k] = [] + tokenized_caption = ' '.join([w for w in line.rstrip().split(' ') \ + if w not in PUNCTUATIONS]) + final_tokenized_captions_for_image[k].append(tokenized_caption) + + return final_tokenized_captions_for_image diff --git a/tools/refer/evaluation/tokenizer/stanford-corenlp-3.4.1.jar b/tools/refer/evaluation/tokenizer/stanford-corenlp-3.4.1.jar new file mode 100644 index 0000000..3cfa0a0 Binary files /dev/null and b/tools/refer/evaluation/tokenizer/stanford-corenlp-3.4.1.jar differ diff --git a/tools/refer/external/.gitignore b/tools/refer/external/.gitignore new file mode 100644 index 0000000..20d59f9 --- /dev/null +++ b/tools/refer/external/.gitignore @@ -0,0 +1,3 @@ +*.pyc +_mask.so +_mask.c diff --git a/tools/refer/external/README.md b/tools/refer/external/README.md new file mode 100644 index 0000000..0a0a681 --- /dev/null +++ b/tools/refer/external/README.md @@ -0,0 +1 @@ +The codes inside this 
folder are copied from pycocotools: https://github.com/pdollar/coco \ No newline at end of file diff --git a/tools/refer/external/__init__.py b/tools/refer/external/__init__.py new file mode 100644 index 0000000..3f7d85b --- /dev/null +++ b/tools/refer/external/__init__.py @@ -0,0 +1 @@ +__author__ = 'tylin' diff --git a/tools/refer/external/_mask.cpython-36m-x86_64-linux-gnu.so b/tools/refer/external/_mask.cpython-36m-x86_64-linux-gnu.so new file mode 100755 index 0000000..4fed2ad Binary files /dev/null and b/tools/refer/external/_mask.cpython-36m-x86_64-linux-gnu.so differ diff --git a/tools/refer/external/_mask.cpython-37m-x86_64-linux-gnu.so b/tools/refer/external/_mask.cpython-37m-x86_64-linux-gnu.so new file mode 100755 index 0000000..f1cd270 Binary files /dev/null and b/tools/refer/external/_mask.cpython-37m-x86_64-linux-gnu.so differ diff --git a/tools/refer/external/_mask.pyx b/tools/refer/external/_mask.pyx new file mode 100644 index 0000000..9f0562c --- /dev/null +++ b/tools/refer/external/_mask.pyx @@ -0,0 +1,291 @@ +# distutils: language = c +# distutils: sources = external/maskApi.c + +#************************************************************************** +# Microsoft COCO Toolbox. version 2.0 +# Data, paper, and tutorials available at: http://mscoco.org/ +# Code written by Piotr Dollar and Tsung-Yi Lin, 2015. +# Licensed under the Simplified BSD License [see coco/license.txt] +#************************************************************************** + +__author__ = 'tsungyi' + +# import both Python-level and C-level symbols of Numpy +# the API uses Numpy to interface C and Python +import numpy as np +cimport numpy as np +from libc.stdlib cimport malloc, free + +# initialize Numpy. must do. 
+np.import_array() + +# import numpy C function +# we use PyArray_ENABLEFLAGS to make Numpy ndarray responsible for memory management +cdef extern from "numpy/arrayobject.h": + void PyArray_ENABLEFLAGS(np.ndarray arr, int flags) + +# Declare the prototype of the C functions in MaskApi.h +cdef extern from "maskApi.h": + ctypedef unsigned int uint + ctypedef unsigned long siz + ctypedef unsigned char byte + ctypedef double* BB + ctypedef struct RLE: + siz h, + siz w, + siz m, + uint* cnts, + void rlesInit( RLE **R, siz n ) + void rleEncode( RLE *R, const byte *M, siz h, siz w, siz n ) + void rleDecode( const RLE *R, byte *mask, siz n ) + void rleMerge( const RLE *R, RLE *M, siz n, bint intersect ) + void rleArea( const RLE *R, siz n, uint *a ) + void rleIou( RLE *dt, RLE *gt, siz m, siz n, byte *iscrowd, double *o ) + void bbIou( BB dt, BB gt, siz m, siz n, byte *iscrowd, double *o ) + void rleToBbox( const RLE *R, BB bb, siz n ) + void rleFrBbox( RLE *R, const BB bb, siz h, siz w, siz n ) + void rleFrPoly( RLE *R, const double *xy, siz k, siz h, siz w ) + char* rleToString( const RLE *R ) + void rleFrString( RLE *R, char *s, siz h, siz w ) + +# python class to wrap RLE array in C +# the class handles the memory allocation and deallocation +cdef class RLEs: + cdef RLE *_R + cdef siz _n + + def __cinit__(self, siz n =0): + rlesInit(&self._R, n) + self._n = n + + # free the RLE array here + def __dealloc__(self): + if self._R is not NULL: + for i in range(self._n): + free(self._R[i].cnts) + free(self._R) + def __getattr__(self, key): + if key == 'n': + return self._n + raise AttributeError(key) + +# python class to wrap Mask array in C +# the class handles the memory allocation and deallocation +cdef class Masks: + cdef byte *_mask + cdef siz _h + cdef siz _w + cdef siz _n + + def __cinit__(self, h, w, n): + self._mask = <byte*> malloc(h*w*n* sizeof(byte)) + self._h = h + self._w = w + self._n = n + # def __dealloc__(self): + # the memory management of _mask has been passed to 
np.ndarray + # it doesn't need to be freed here + + # called when passing into np.array() and return an np.ndarray in column-major order + def __array__(self): + cdef np.npy_intp shape[1] + shape[0] = self._h*self._w*self._n + # Create a 1D array, and reshape it to fortran/Matlab column-major array + ndarray = np.PyArray_SimpleNewFromData(1, shape, np.NPY_UINT8, self._mask).reshape((self._h, self._w, self._n), order='F') + # The _mask allocated by Masks is now handled by ndarray + PyArray_ENABLEFLAGS(ndarray, np.NPY_OWNDATA) + return ndarray + +# internal conversion from Python RLEs object to compressed RLE format +def _toString(RLEs Rs): + cdef siz n = Rs.n + cdef bytes py_string + cdef char* c_string + objs = [] + for i in range(n): + c_string = rleToString( &Rs._R[i] ) + py_string = c_string + objs.append({ + 'size': [Rs._R[i].h, Rs._R[i].w], + 'counts': py_string + }) + free(c_string) + return objs + +# internal conversion from compressed RLE format to Python RLEs object +def _frString(rleObjs): + cdef siz n = len(rleObjs) + Rs = RLEs(n) + cdef bytes py_string + cdef char* c_string + for i, obj in enumerate(rleObjs): + py_string = str(obj['counts']) + c_string = py_string + rleFrString( &Rs._R[i], c_string, obj['size'][0], obj['size'][1] ) + return Rs + +# encode mask to RLEs objects +# list of RLE string can be generated by RLEs member function +def encode(np.ndarray[np.uint8_t, ndim=3, mode='fortran'] mask): + h, w, n = mask.shape[0], mask.shape[1], mask.shape[2] + cdef RLEs Rs = RLEs(n) + rleEncode(Rs._R,mask.data,h,w,n) + objs = _toString(Rs) + return objs + +# decode mask from compressed list of RLE string or RLEs object +def decode(rleObjs): + cdef RLEs Rs = _frString(rleObjs) + h, w, n = Rs._R[0].h, Rs._R[0].w, Rs._n + masks = Masks(h, w, n) + rleDecode( Rs._R, masks._mask, n ); + return np.array(masks) + +def merge(rleObjs, bint intersect=0): + cdef RLEs Rs = _frString(rleObjs) + cdef RLEs R = RLEs(1) + rleMerge(Rs._R, R._R, Rs._n, intersect) + obj = 
_toString(R)[0] + return obj + +def area(rleObjs): + cdef RLEs Rs = _frString(rleObjs) + cdef uint* _a = malloc(Rs._n* sizeof(uint)) + rleArea(Rs._R, Rs._n, _a) + cdef np.npy_intp shape[1] + shape[0] = Rs._n + a = np.array((Rs._n, ), dtype=np.uint8) + a = np.PyArray_SimpleNewFromData(1, shape, np.NPY_UINT32, _a) + PyArray_ENABLEFLAGS(a, np.NPY_OWNDATA) + return a + +# iou computation. support function overload (RLEs-RLEs and bbox-bbox). +def iou( dt, gt, pyiscrowd ): + def _preproc(objs): + if len(objs) == 0: + return objs + if type(objs) == np.ndarray: + if len(objs.shape) == 1: + objs = objs.reshape((objs[0], 1)) + # check if it's Nx4 bbox + if not len(objs.shape) == 2 or not objs.shape[1] == 4: + raise Exception('numpy ndarray input is only for *bounding boxes* and should have Nx4 dimension') + objs = objs.astype(np.double) + elif type(objs) == list: + # check if list is in box format and convert it to np.ndarray + isbox = np.all(np.array([(len(obj)==4) and ((type(obj)==list) or (type(obj)==np.ndarray)) for obj in objs])) + isrle = np.all(np.array([type(obj) == dict for obj in objs])) + if isbox: + objs = np.array(objs, dtype=np.double) + if len(objs.shape) == 1: + objs = objs.reshape((1,objs.shape[0])) + elif isrle: + objs = _frString(objs) + else: + raise Exception('list input can be bounding box (Nx4) or RLEs ([RLE])') + else: + raise Exception('unrecognized type. 
The following type: RLEs (rle), np.ndarray (box), and list (box) are supported.') + return objs + def _rleIou(RLEs dt, RLEs gt, np.ndarray[np.uint8_t, ndim=1] iscrowd, siz m, siz n, np.ndarray[np.double_t, ndim=1] _iou): + rleIou( dt._R, gt._R, m, n, iscrowd.data, _iou.data ) + def _bbIou(np.ndarray[np.double_t, ndim=2] dt, np.ndarray[np.double_t, ndim=2] gt, np.ndarray[np.uint8_t, ndim=1] iscrowd, siz m, siz n, np.ndarray[np.double_t, ndim=1] _iou): + bbIou( dt.data, gt.data, m, n, iscrowd.data, _iou.data ) + def _len(obj): + cdef siz N = 0 + if type(obj) == RLEs: + N = obj.n + elif len(obj)==0: + pass + elif type(obj) == np.ndarray: + N = obj.shape[0] + return N + # convert iscrowd to numpy array + cdef np.ndarray[np.uint8_t, ndim=1] iscrowd = np.array(pyiscrowd, dtype=np.uint8) + # simple type checking + cdef siz m, n + dt = _preproc(dt) + gt = _preproc(gt) + m = _len(dt) + n = _len(gt) + if m == 0 or n == 0: + return [] + if not type(dt) == type(gt): + raise Exception('The dt and gt should have the same data type, either RLEs, list or np.ndarray') + + # define local variables + cdef double* _iou = 0 + cdef np.npy_intp shape[1] + # check type and assign iou function + if type(dt) == RLEs: + _iouFun = _rleIou + elif type(dt) == np.ndarray: + _iouFun = _bbIou + else: + raise Exception('input data type not allowed.') + _iou = malloc(m*n* sizeof(double)) + iou = np.zeros((m*n, ), dtype=np.double) + shape[0] = m*n + iou = np.PyArray_SimpleNewFromData(1, shape, np.NPY_DOUBLE, _iou) + PyArray_ENABLEFLAGS(iou, np.NPY_OWNDATA) + _iouFun(dt, gt, iscrowd, m, n, iou) + return iou.reshape((m,n), order='F') + +def toBbox( rleObjs ): + cdef RLEs Rs = _frString(rleObjs) + cdef siz n = Rs.n + cdef BB _bb = malloc(4*n* sizeof(double)) + rleToBbox( Rs._R, _bb, n ) + cdef np.npy_intp shape[1] + shape[0] = 4*n + bb = np.array((1,4*n), dtype=np.double) + bb = np.PyArray_SimpleNewFromData(1, shape, np.NPY_DOUBLE, _bb).reshape((n, 4)) + PyArray_ENABLEFLAGS(bb, np.NPY_OWNDATA) + return 
bb + +def frBbox(np.ndarray[np.double_t, ndim=2] bb, siz h, siz w ): + cdef siz n = bb.shape[0] + Rs = RLEs(n) + rleFrBbox( Rs._R, bb.data, h, w, n ) + objs = _toString(Rs) + return objs + +def frPoly( poly, siz h, siz w ): + cdef np.ndarray[np.double_t, ndim=1] np_poly + n = len(poly) + Rs = RLEs(n) + for i, p in enumerate(poly): + np_poly = np.array(p, dtype=np.double, order='F') + rleFrPoly( &Rs._R[i], np_poly.data, len(np_poly)/2, h, w ) + objs = _toString(Rs) + return objs + +def frUncompressedRLE(ucRles, siz h, siz w): + cdef np.ndarray[np.uint32_t, ndim=1] cnts + cdef RLE R + cdef uint *data + n = len(ucRles) + objs = [] + for i in range(n): + Rs = RLEs(1) + cnts = np.array(ucRles[i]['counts'], dtype=np.uint32) + # time for malloc can be saved here but it's fine + data = malloc(len(cnts)* sizeof(uint)) + for j in range(len(cnts)): + data[j] = cnts[j] + R = RLE(ucRles[i]['size'][0], ucRles[i]['size'][1], len(cnts), data) + Rs._R[0] = R + objs.append(_toString(Rs)[0]) + return objs + +def frPyObjects(pyobj, siz h, w): + if type(pyobj) == np.ndarray: + objs = frBbox(pyobj, h, w ) + elif type(pyobj) == list and len(pyobj[0]) == 4: + objs = frBbox(pyobj, h, w ) + elif type(pyobj) == list and len(pyobj[0]) > 4: + objs = frPoly(pyobj, h, w ) + elif type(pyobj) == list and type(pyobj[0]) == dict: + objs = frUncompressedRLE(pyobj, h, w) + else: + raise Exception('input type is not supported.') + return objs diff --git a/tools/refer/external/mask.py b/tools/refer/external/mask.py new file mode 100644 index 0000000..67ef560 --- /dev/null +++ b/tools/refer/external/mask.py @@ -0,0 +1,82 @@ +__author__ = 'tsungyi' + +import tools.refer.external._mask as _mask + +# Interface for manipulating masks stored in RLE format. +# +# RLE is a simple yet efficient format for storing binary masks. RLE +# first divides a vector (or vectorized image) into a series of piecewise +# constant regions and then for each piece simply stores the length of +# that piece. 
For example, given M=[0 0 1 1 1 0 1] the RLE counts would +# be [2 3 1 1], or for M=[1 1 1 1 1 1 0] the counts would be [0 6 1] +# (note that the odd counts are always the numbers of zeros). Instead of +# storing the counts directly, additional compression is achieved with a +# variable bitrate representation based on a common scheme called LEB128. +# +# Compression is greatest given large piecewise constant regions. +# Specifically, the size of the RLE is proportional to the number of +# *boundaries* in M (or for an image the number of boundaries in the y +# direction). Assuming fairly simple shapes, the RLE representation is +# O(sqrt(n)) where n is number of pixels in the object. Hence space usage +# is substantially lower, especially for large simple objects (large n). +# +# Many common operations on masks can be computed directly using the RLE +# (without need for decoding). This includes computations such as area, +# union, intersection, etc. All of these operations are linear in the +# size of the RLE, in other words they are O(sqrt(n)) where n is the area +# of the object. Computing these operations on the original mask is O(n). +# Thus, using the RLE can result in substantial computational savings. +# +# The following API functions are defined: +# encode - Encode binary masks using RLE. +# decode - Decode binary masks encoded via RLE. +# merge - Compute union or intersection of encoded masks. +# iou - Compute intersection over union between masks. +# area - Compute area of encoded masks. +# toBbox - Get bounding boxes surrounding encoded masks. +# frPyObjects - Convert polygon, bbox, and uncompressed RLE to encoded RLE mask. 
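The counts scheme described above can be sketched in a few lines of plain Python. This is an illustrative helper, not part of the toolbox: it assumes the mask has already been flattened in column-major order, and it skips the LEB128-style string compression.

```python
def rle_counts(mask):
    """Run-length counts as used by the mask API: alternating numbers of
    0s and 1s, always starting with the number of leading zeros."""
    counts, prev, run = [], 0, 0
    for v in mask:
        if v != prev:          # a run just ended: store its length
            counts.append(run)
            run, prev = 0, v
        run += 1
    counts.append(run)         # flush the final run
    return counts

print(rle_counts([0, 0, 1, 1, 1, 0, 1]))  # [2, 3, 1, 1]
print(rle_counts([1, 1, 1, 1, 1, 1, 0]))  # [0, 6, 1]
```

Summing the counts recovers the mask length, and decoding simply re-expands each run; this is why area, union, and intersection can be computed directly on the counts without decoding.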
+# +# Usage: +# Rs = encode( masks ) +# masks = decode( Rs ) +# R = merge( Rs, intersect=false ) +# o = iou( dt, gt, iscrowd ) +# a = area( Rs ) +# bbs = toBbox( Rs ) +# Rs = frPyObjects( [pyObjects], h, w ) +# +# In the API the following formats are used: +# Rs - [dict] Run-length encoding of binary masks +# R - dict Run-length encoding of binary mask +# masks - [hxwxn] Binary mask(s) (must have type np.ndarray(dtype=uint8) in column-major order) +# iscrowd - [nx1] list of np.ndarray. 1 indicates corresponding gt image has crowd region to ignore +# bbs - [nx4] Bounding box(es) stored as [x y w h] +# poly - Polygon stored as [[x1 y1 x2 y2...],[x1 y1 ...],...] (2D list) +# dt,gt - May be either bounding boxes or encoded masks +# Both poly and bbs are 0-indexed (bbox=[0 0 1 1] encloses first pixel). +# +# Finally, a note about the intersection over union (iou) computation. +# The standard iou of a ground truth (gt) and detected (dt) object is +# iou(gt,dt) = area(intersect(gt,dt)) / area(union(gt,dt)) +# For "crowd" regions, we use a modified criteria. If a gt object is +# marked as "iscrowd", we allow a dt to match any subregion of the gt. +# Choosing gt' in the crowd gt that best matches the dt can be done using +# gt'=intersect(dt,gt). Since by definition union(gt',dt)=dt, computing +# iou(gt,dt,iscrowd) = iou(gt',dt) = area(intersect(gt,dt)) / area(dt) +# For crowd gt regions we use this modified criteria above for the iou. +# +# To compile run "python setup.py build_ext --inplace" +# Please do not contact us for help with compiling. +# +# Microsoft COCO Toolbox. version 2.0 +# Data, paper, and tutorials available at: http://mscoco.org/ +# Code written by Piotr Dollar and Tsung-Yi Lin, 2015. 
+# Licensed under the Simplified BSD License [see coco/license.txt]
+
+encode = _mask.encode
+decode = _mask.decode
+iou = _mask.iou
+merge = _mask.merge
+area = _mask.area
+toBbox = _mask.toBbox
+frPyObjects = _mask.frPyObjects
diff --git a/tools/refer/external/maskApi.c b/tools/refer/external/maskApi.c
new file mode 100644
index 0000000..85e3979
--- /dev/null
+++ b/tools/refer/external/maskApi.c
@@ -0,0 +1,230 @@
+/**************************************************************************
+* Microsoft COCO Toolbox. version 2.0
+* Data, paper, and tutorials available at: http://mscoco.org/
+* Code written by Piotr Dollar and Tsung-Yi Lin, 2015.
+* Licensed under the Simplified BSD License [see coco/license.txt]
+**************************************************************************/
+#include "maskApi.h"
+#include <math.h>
+#include <stdlib.h>
+
+uint umin( uint a, uint b ) { return (a<b) ? a : b; }
+uint umax( uint a, uint b ) { return (a>b) ? a : b; }
+
+void rleInit( RLE *R, siz h, siz w, siz m, uint *cnts ) {
+  R->h=h; R->w=w; R->m=m; R->cnts=(m==0)?0:malloc(sizeof(uint)*m);
+  siz j; if(cnts) for(j=0; j<m; j++) R->cnts[j]=cnts[j];
+}
+
+void rleFree( RLE *R ) {
+  free(R->cnts); R->cnts=0;
+}
+
+void rlesInit( RLE **R, siz n ) {
+  siz i; *R = (RLE*) malloc(sizeof(RLE)*n);
+  for(i=0; i<n; i++) rleInit((*R)+i,0,0,0,0);
+}
+
+void rlesFree( RLE **R, siz n ) {
+  siz i; for(i=0; i<n; i++) rleFree((*R)+i);
+  free(*R); *R=0;
+}
+
+void rleEncode( RLE *R, const byte *M, siz h, siz w, siz n ) {
+  siz i, j, k, a=w*h; uint c, *cnts; byte p;
+  cnts = malloc(sizeof(uint)*(a+1));
+  for(i=0; i<n; i++) {
+    const byte *T=M+a*i; k=0; p=0; c=0;
+    for(j=0; j<a; j++) { if(T[j]!=p) { cnts[k++]=c; c=0; p=T[j]; } c++; }
+    cnts[k++]=c; rleInit(R+i,h,w,k,cnts);
+  }
+  free(cnts);
+}
+
+void rleDecode( const RLE *R, byte *M, siz n ) {
+  siz i, j, k; for( i=0; i<n; i++ ) {
+    byte v=0; for( j=0; j<R[i].m; j++ ) {
+      for( k=0; k<R[i].cnts[j]; k++ ) *(M++)=v; v=!v; }
+  }
+}
+
+void rleMerge( const RLE *R, RLE *M, siz n, int intersect ) {
+  uint *cnts, c, ca, cb, cc, ct; int v, va, vb, vp;
+  siz i, a, b, h=R[0].h, w=R[0].w, m=R[0].m; RLE A, B;
+  if(n==0) { rleInit(M,0,0,0,0); return; }
+  if(n==1) { rleInit(M,h,w,m,R[0].cnts); return; }
+  cnts = malloc(sizeof(uint)*(h*w+1));
+  for( a=0; a<m; a++ ) cnts[a]=R[0].cnts[a];
+  for( i=1; i<n; i++ ) {
+    B=R[i]; if(B.h!=h||B.w!=w) { h=w=m=0; break; }
+    rleInit(&A,h,w,m,cnts); ca=A.cnts[0]; cb=B.cnts[0];
+    v=va=vb=0; m=0; a=b=1; cc=0; ct=1;
+    while( ct>0 ) {
+      c=umin(ca,cb); cc+=c; ct=0;
+      ca-=c; if(!ca && a<A.m) { ca=A.cnts[a++]; va=!va; } ct+=ca;
+      cb-=c; if(!cb && b<B.m) { cb=B.cnts[b++]; vb=!vb; } ct+=cb;
+      vp=v; if(intersect) v=va&&vb; else v=va||vb;
+      if( v!=vp || ct==0 ) { cnts[m++]=cc; cc=0; }
+    }
+    rleFree(&A);
+  }
+  rleInit(M,h,w,m,cnts); free(cnts);
+}
+
+void rleArea( const RLE *R, siz n, uint *a ) {
+  siz i, j; for( i=0; i<n; i++ ) {
+    a[i]=0; for( j=1; j<R[i].m; j+=2 ) a[i]+=R[i].cnts[j];
+  }
+}
+
+void rleIou( RLE *dt, RLE *gt, siz m, siz n, byte *iscrowd, double *o ) {
+  siz g, d; BB db, gb; int crowd;
+  db=malloc(sizeof(double)*m*4); rleToBbox(dt,db,m);
+  gb=malloc(sizeof(double)*n*4); rleToBbox(gt,gb,n);
+  bbIou(db,gb,m,n,iscrowd,o); free(db); free(gb);
+  for( g=0; g<n; g++ ) for( d=0; d<m; d++ ) if(o[g*m+d]>0) {
+    crowd=iscrowd!=NULL && iscrowd[g];
+    if(dt[d].h!=gt[g].h || dt[d].w!=gt[g].w) { o[g*m+d]=-1; continue; }
+    siz ka, kb, a, b; uint c, ca, cb, ct, i, u; int va, vb;
+    ca=dt[d].cnts[0]; ka=dt[d].m; va=vb=0;
+    cb=gt[g].cnts[0]; kb=gt[g].m; a=b=1; i=u=0; ct=1;
+    while( ct>0 ) {
+      c=umin(ca,cb); if(va||vb) { u+=c; if(va&&vb) i+=c; } ct=0;
+      ca-=c; if(!ca && a<ka) { ca=dt[d].cnts[a++]; va=!va; } ct+=ca;
+      cb-=c; if(!cb && b<kb) { cb=gt[g].cnts[b++]; vb=!vb; } ct+=cb;
+    }
+    if(i==0) u=1; else if(crowd) rleArea(dt+d,1,&u);
+    o[g*m+d] = (double)i/(double)u;
+  }
+}
+
+void rleNms( RLE *dt, siz n, uint *keep, double thr ) {
+  siz i, j; double u;
+  for( i=0; i<n; i++ ) keep[i]=1;
+  for( i=0; i<n; i++ ) if(keep[i]) {
+    for( j=i+1; j<n; j++ ) if(keep[j]) {
+      rleIou(dt+i,dt+j,1,1,0,&u);
+      if(u>thr) keep[j]=0;
+    }
+  }
+}
+
+void bbIou( BB dt, BB gt, siz m, siz n, byte *iscrowd, double *o ) {
+  double h, w, i, u, ga, da; siz g, d; int crowd;
+  for( g=0; g<n; g++ ) {
+    BB G=gt+g*4; ga=G[2]*G[3]; crowd=iscrowd!=NULL && iscrowd[g];
+    for( d=0; d<m; d++ ) {
+      BB D=dt+d*4; da=D[2]*D[3]; o[g*m+d]=0;
+      w=fmin(D[2]+D[0],G[2]+G[0])-fmax(D[0],G[0]); if(w<=0) continue;
+      h=fmin(D[3]+D[1],G[3]+G[1])-fmax(D[1],G[1]); if(h<=0) continue;
+      i=w*h; u = crowd ? da : da+ga-i; o[g*m+d]=i/u;
+    }
+  }
+}
+
+void bbNms( BB dt, siz n, uint *keep, double thr ) {
+  siz i, j; double u;
+  for( i=0; i<n; i++ ) keep[i]=1;
+  for( i=0; i<n; i++ ) if(keep[i]) {
+    for( j=i+1; j<n; j++ ) if(keep[j]) {
+      bbIou(dt+4*i,dt+4*j,1,1,0,&u);
+      if(u>thr) keep[j]=0;
+    }
+  }
+}
+
+void rleToBbox( const RLE *R, BB bb, siz n ) {
+  siz i; for( i=0; i<n; i++ ) {
+    uint h, w, x, y, xs, ys, xe, ye, xp=0, cc, t; siz j, m;
+    h=(uint)R[i].h; w=(uint)R[i].w; m=R[i].m;
+    m=((siz)(m/2))*2; xs=w; ys=h; xe=ye=0; cc=0;
+    if(m==0) { bb[4*i+0]=bb[4*i+1]=bb[4*i+2]=bb[4*i+3]=0; continue; }
+    for( j=0; j<m; j++ ) {
+      cc+=R[i].cnts[j]; t=cc-j%2; y=t%h; x=(t-y)/h;
+      if(j%2==0) xp=x; else if(xp<x) { ys=0; ye=h-1; }
+      xs=umin(xs,x); xe=umax(xe,x);
+      ys=umin(ys,y); ye=umax(ye,y);
+    }
+    bb[4*i+0]=xs; bb[4*i+2]=xe-xs+1;
+    bb[4*i+1]=ys; bb[4*i+3]=ye-ys+1;
+  }
+}
+
+void rleFrBbox( RLE *R, const BB bb, siz h, siz w, siz n ) {
+  siz i; for( i=0; i<n; i++ ) {
+    double xs=bb[4*i+0], xe=xs+bb[4*i+2];
+    double ys=bb[4*i+1], ye=ys+bb[4*i+3];
+    double xy[8] = {xs,ys,xs,ye,xe,ye,xe,ys};
+    rleFrPoly( R+i, xy, 4, h, w );
+  }
+}
+
+int uintCompare(const void *a, const void *b) {
+  uint c=*((uint*)a), d=*((uint*)b); return c>d?1:c<d?-1:0;
+}
+
+void rleFrPoly( RLE *R, const double *xy, siz k, siz h, siz w ) {
+  /* upsample and get discrete points densely along entire boundary */
+  siz j, m=0; double scale=5; int *x, *y, *u, *v; uint *a, *b;
+  x=malloc(sizeof(int)*(k+1)); y=malloc(sizeof(int)*(k+1));
+  for(j=0; j<k; j++) x[j]=(int)(scale*xy[j*2+0]+.5); x[k]=x[0];
+  for(j=0; j<k; j++) y[j]=(int)(scale*xy[j*2+1]+.5); y[k]=y[0];
+  for(j=0; j<k; j++) m+=umax(abs(x[j]-x[j+1]),abs(y[j]-y[j+1]))+1;
+  u=malloc(sizeof(int)*m); v=malloc(sizeof(int)*m); m=0;
+  for( j=0; j<k; j++ ) {
+    int xs=x[j], xe=x[j+1], ys=y[j], ye=y[j+1], dx, dy, t, d;
+    int flip; double s; dx=abs(xe-xs); dy=abs(ys-ye);
+    flip = (dx>=dy && xs>xe) || (dx<dy && ys>ye);
+    if(flip) { t=xs; xs=xe; xe=t; t=ys; ys=ye; ye=t; }
+    s = dx>=dy ?
(double)(ye-ys)/dx : (double)(xe-xs)/dy;
+    if(dx>=dy) for( d=0; d<=dx; d++ ) {
+      t=flip?dx-d:d; u[m]=t+xs; v[m]=(int)(ys+s*t+.5); m++;
+    } else for( d=0; d<=dy; d++ ) {
+      t=flip?dy-d:d; v[m]=t+ys; u[m]=(int)(xs+s*t+.5); m++;
+    }
+  }
+  /* get points along y-boundary and downsample */
+  free(x); free(y); k=m; m=0; double xd, yd;
+  x=malloc(sizeof(int)*k); y=malloc(sizeof(int)*k);
+  for( j=1; j<k; j++ ) if(u[j]!=u[j-1]) {
+    xd=(double)(u[j]<u[j-1]?u[j]:u[j]-1); xd=(xd+.5)/scale-.5;
+    if( floor(xd)!=xd || xd<0 || xd>w-1 ) continue;
+    yd=(double)(v[j]<v[j-1]?v[j]:v[j-1]); yd=(yd+.5)/scale-.5;
+    if(yd<0) yd=0; else if(yd>h) yd=h; yd=ceil(yd);
+    x[m]=(int) xd; y[m]=(int) yd; m++;
+  }
+  /* compute rle encoding given y-boundary points */
+  k=m; a=malloc(sizeof(uint)*(k+1));
+  for( j=0; j<k; j++ ) a[j]=(uint)(x[j]*(int)(h)+y[j]);
+  a[k++]=(uint)(h*w); free(u); free(v); free(x); free(y);
+  qsort(a,k,sizeof(uint),uintCompare); uint p=0;
+  for( j=0; j<k; j++ ) { uint t=a[j]; a[j]-=p; p=t; }
+  b=malloc(sizeof(uint)*k); j=m=0; b[m++]=a[j++];
+  while(j<k) if(a[j]>0) b[m++]=a[j++]; else {
+    j++; if(j<k) b[m-1]+=a[j++]; }
+  rleInit(R,h,w,m,b); free(a); free(b);
+}
+
+char* rleToString( const RLE *R ) {
+  /* Similar to LEB128 but using 6 bits/char and ascii chars 48-111. */
+  siz i, m=R->m, p=0; long x; int more;
+  char *s=malloc(sizeof(char)*m*6);
+  for( i=0; i<m; i++ ) {
+    x=(long) R->cnts[i]; if(i>2) x-=(long) R->cnts[i-2]; more=1;
+    while( more ) {
+      char c=x & 0x1f; x >>= 5; more=(c & 0x10) ? x!=-1 : x!=0;
+      if(more) c |= 0x20; c+=48; s[p++]=c;
+    }
+  }
+  s[p]=0; return s;
+}
+
+void rleFrString( RLE *R, char *s, siz h, siz w ) {
+  siz m=0, p=0, k; long x; int more; uint *cnts;
+  while( s[m] ) m++; cnts=malloc(sizeof(uint)*m); m=0;
+  while( s[p] ) {
+    x=0; k=0; more=1;
+    while( more ) {
+      char c=s[p]-48; x |= (c & 0x1f) << 5*k;
+      more = c & 0x20; p++; k++;
+      if(!more && (c & 0x10)) x |= -1 << 5*k;
+    }
+    if(m>2) x+=(long) cnts[m-2]; cnts[m++]=(uint) x;
+  }
+  rleInit(R,h,w,m,cnts); free(cnts);
+}
diff --git a/tools/refer/external/maskApi.h b/tools/refer/external/maskApi.h
new file mode 100644
index 0000000..ebc7892
--- /dev/null
+++ b/tools/refer/external/maskApi.h
@@ -0,0 +1,60 @@
+/**************************************************************************
+* Microsoft COCO Toolbox. version 2.0
+* Data, paper, and tutorials available at: http://mscoco.org/
+* Code written by Piotr Dollar and Tsung-Yi Lin, 2015.
+* Licensed under the Simplified BSD License [see coco/license.txt] +**************************************************************************/ +#pragma once + +typedef unsigned int uint; +typedef unsigned long siz; +typedef unsigned char byte; +typedef double* BB; +typedef struct { siz h, w, m; uint *cnts; } RLE; + +/* Initialize/destroy RLE. */ +void rleInit( RLE *R, siz h, siz w, siz m, uint *cnts ); +void rleFree( RLE *R ); + +/* Initialize/destroy RLE array. */ +void rlesInit( RLE **R, siz n ); +void rlesFree( RLE **R, siz n ); + +/* Encode binary masks using RLE. */ +void rleEncode( RLE *R, const byte *mask, siz h, siz w, siz n ); + +/* Decode binary masks encoded via RLE. */ +void rleDecode( const RLE *R, byte *mask, siz n ); + +/* Compute union or intersection of encoded masks. */ +void rleMerge( const RLE *R, RLE *M, siz n, int intersect ); + +/* Compute area of encoded masks. */ +void rleArea( const RLE *R, siz n, uint *a ); + +/* Compute intersection over union between masks. */ +void rleIou( RLE *dt, RLE *gt, siz m, siz n, byte *iscrowd, double *o ); + +/* Compute non-maximum suppression between bounding masks */ +void rleNms( RLE *dt, siz n, uint *keep, double thr ); + +/* Compute intersection over union between bounding boxes. */ +void bbIou( BB dt, BB gt, siz m, siz n, byte *iscrowd, double *o ); + +/* Compute non-maximum suppression between bounding boxes */ +void bbNms( BB dt, siz n, uint *keep, double thr ); + +/* Get bounding boxes surrounding encoded masks. */ +void rleToBbox( const RLE *R, BB bb, siz n ); + +/* Convert bounding boxes to encoded masks. */ +void rleFrBbox( RLE *R, const BB bb, siz h, siz w, siz n ); + +/* Convert polygon to encoded mask. */ +void rleFrPoly( RLE *R, const double *xy, siz k, siz h, siz w ); + +/* Get compressed string representation of encoded mask. */ +char* rleToString( const RLE *R ); + +/* Convert from compressed string representation of encoded mask. 
*/ +void rleFrString( RLE *R, char *s, siz h, siz w ); diff --git a/tools/refer/pyEvalDemo.ipynb b/tools/refer/pyEvalDemo.ipynb new file mode 100644 index 0000000..5493637 --- /dev/null +++ b/tools/refer/pyEvalDemo.ipynb @@ -0,0 +1,308 @@ +{ + "cells": [ + { + "cell_type": "code", + "execution_count": 1, + "metadata": { + "collapsed": false + }, + "outputs": [], + "source": [ + "%matplotlib inline\n", + "from refer import REFER\n", + "import numpy as np\n", + "import sys\n", + "import os.path as osp\n", + "import json\n", + "import matplotlib.pyplot as plt\n", + "from matplotlib.patches import Rectangle" + ] + }, + { + "cell_type": "code", + "execution_count": 2, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "loading dataset refcoco into memory...\n", + "creating index...\n", + "index created.\n", + "DONE (t=9.47s)\n" + ] + } + ], + "source": [ + "data_root = './data' # contains refclef, refcoco, refcoco+, refcocog and images\n", + "dataset = 'refcoco'\n", + "splitBy = 'unc'\n", + "refer = REFER(data_root, dataset, splitBy)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# 1. 
Evaluate Referring Expressions by Language Metrics" + ] + }, + { + "cell_type": "code", + "execution_count": 4, + "metadata": { + "collapsed": false + }, + "outputs": [], + "source": [ + "sys.path.insert(0, './evaluation')\n", + "from refEvaluation import RefEvaluation" + ] + }, + { + "cell_type": "code", + "execution_count": 5, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "{u'sent': u'man in black', u'ref_id': 47}\n" + ] + } + ], + "source": [ + "# Here's our example expression file\n", + "sample_expr_file = json.load(open('test/sample_expressions_testA.json', 'r'))\n", + "sample_exprs = sample_expr_file['predictions']\n", + "print sample_exprs[0]" + ] + }, + { + "cell_type": "code", + "execution_count": 6, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "tokenization...\n", + "setting up scorers...\n", + "computing Bleu score...\n", + "{'reflen': 5356, 'guess': [5009, 3034, 1477, 275], 'testlen': 5009, 'correct': [2576, 580, 112, 2]}\n", + "ratio: 0.935212845407\n", + "Bleu_1: 0.480\n", + "Bleu_2: 0.293\n", + "Bleu_3: 0.182\n", + "Bleu_4: 0.080\n", + "computing METEOR score...\n", + "METEOR: 0.172\n", + "computing Rouge score...\n", + "ROUGE_L: 0.414\n", + "computing CIDEr score...\n", + "CIDEr: 0.669\n" + ] + } + ], + "source": [ + "refEval = RefEvaluation(refer, sample_exprs)\n", + "refEval.evaluate()" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# 2. 
Evaluate Referring Expressions by Duplicate Rate" + ] + }, + { + "cell_type": "code", + "execution_count": 7, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "108/750 (14.40%) images have duplicate predicted sentences.\n" + ] + } + ], + "source": [ + "# evaluate how many images contain duplicate expressions\n", + "pred_refToSent = {int(it['ref_id']): it['sent'] for it in sample_exprs}\n", + "pred_imgToSents = {}\n", + "for ref_id, pred_sent in pred_refToSent.items():\n", + " image_id = refer.Refs[ref_id]['image_id']\n", + " pred_imgToSents[image_id] = pred_imgToSents.get(image_id, []) + [pred_sent]\n", + "# count duplicate\n", + "duplicate = 0\n", + "for image_id, sents in pred_imgToSents.items():\n", + " if len(set(sents)) < len(sents):\n", + " duplicate += 1\n", + "ratio = duplicate*100.0 / len(pred_imgToSents)\n", + "print '%s/%s (%.2f%%) images have duplicate predicted sentences.' % (duplicate, len(pred_imgToSents), ratio)" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "collapsed": true + }, + "source": [ + "# 3. Evaluate Referring Comprehension" + ] + }, + { + "cell_type": "code", + "execution_count": 49, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "# IoU function\n", + "def computeIoU(box1, box2):\n", + " # each box is of [x1, y1, w, h]\n", + " inter_x1 = max(box1[0], box2[0])\n", + " inter_y1 = max(box1[1], box2[1])\n", + " inter_x2 = min(box1[0]+box1[2]-1, box2[0]+box2[2]-1)\n", + " inter_y2 = min(box1[1]+box1[3]-1, box2[1]+box2[3]-1)\n", + "\n", + " if inter_x1 <= inter_x2 and inter_y1 <= inter_y2:\n", + " inter = (inter_x2-inter_x1+1)*(inter_y2-inter_y1+1)\n", + " else:\n", + " inter = 0\n", + " union = box1[2]*box1[3] + box2[2]*box2[3] - inter\n", + " return float(inter)/union" + ] + }, + { + "cell_type": "code", + "execution_count": 41, + "metadata": { + "collapsed": false + }, + "outputs": [], + "source": [ + "# randomly sample one ref\n", + 
"ref_ids = refer.getRefIds()\n", + "ref_id = ref_ids[np.random.randint(0, len(ref_ids))]\n", + "ref = refer.Refs[ref_id]\n", + "\n", + "# let's fake one bounding box by randomly picking one instance inside this image\n", + "image_id = ref['image_id']\n", + "anns = refer.imgToAnns[image_id]\n", + "ann = anns[np.random.randint(0, len(anns))]" + ] + }, + { + "cell_type": "code", + "execution_count": 42, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "1. person bending\n", + "2. man\n", + "3. the person bending over\n" + ] + }, + { + "data": { + "image/png": "iVBORw0KGgoAAAANSUhEUgAAAMsAAAEACAYAAAAdo4LwAAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAALEgAACxIB0t1+/AAAIABJREFUeJzsvXe0Jdd13vnb55yqG19+nbvR3WjEBkGCJADShASAECmR\nw2EwFShqFExzybJspbHWWLIVCMmUrJG1Zlkyh9J4lCzJosIwSCLFKGYQIEDERmoADXSOL793U9U5\ne88fdV/3a7AbxJAEW57V31pvvXvr1j2hzv7OjlVXzIyLuIiL+NpwF3oAF3ER/6PgIlku4iKeJy6S\n5SIu4nniIlku4iKeJy6S5SIu4nniIlku4iKeJ14QsojI60TkcRF5UkR+7oXo4yIu4lsN+WbnWUTE\nA3uB1wBHgHuAt5vZY9/Uji7iIr7FeCE0y43AU2a238xK4C+AN78A/VzERXxL8UKQZQtwaM37w8Nj\nF3ER/0PjhSDLxfqZi/j/JcIL0OYRYNua99uotMtpiMhFQl3EP2qYmTz72AtBlq8Al4vIDuAo8Dbg\n7c8+qRgs0ul00LIPtoIrC4qiR54FEjm+3iJkAcOQZNTaE1hMqBgiGWYJZwUpFZhvoEWXmEpCyElJ\nqWWKmUPNyBvTSBjBiavUnhnOVdfiV27/Vd51+y8jSPWZgOkZLotU55kBWqJmIAlUEAk4H0CMg5/5\n98z7dWzY8R1s2n4d5wqciAhmdvozwTBApFLwt99+O+9617uG/Q2/nwoUo1LYgrgMcYIlATFMIyYG\nJnifVS2mznA+HmEAlqHiEZeBJSADSYh4rFxCrMQsIc4Tiz6qBf/xt/6AX/6Ff4dYQRRP8AlNhrgc\nfIZJwJmjM/8IPoxTH92GOBAzEHfOeT/XsTPX+auP3X777dx+++1fdT2fL05f7zV9rL4+F8732Ted\nLGYWReQngI8DHviDc0XCThy8n1DzpAGk1MNJTq3WJPmAeEfeHEGKPtFB3mhSdFcIeQ0d9Ch78yhQ\nr49RDApqDZC8Sc07ykGXeqNFjAXiPJZ6OC2qC+QETBEnCAJyWlwrEiCIgaoOF1NwLmEmiICKIuJB\nMkwMw2GW0KHYqwxQLajiGgEzELGqPe+QoQiLc8MFc0MWnlkgEYeZDsks4Go4q87HJcQUcGgAh8ec\nAxQdEtyJgzAyXAwQaqglvAkaF/E+JyJ4UyjmSeIJaqhvQMgI2QSS+lWbaQkTw5Fj0RASkgy0QFA0\ntGhNXQsuQpJqLrI6N8AcSMLMnznGVwviWuF8LiH+evHsNr/ePl4IzYKZfRT46HOekyJWgHMBtZzW\n6CSaIv3eCidnjpFq93Hp1G40ePrFCVq1Nr1BQ
6PEKaQp\nEYk5bIp0eBYj2hjWWYsuYzkBRXMTlfhmWt4XyIkZclmOvPgAofgGTnYAm/1gOaBTkA5CKLI0RNBl\nTvwWaZoirBEqss7Eb/2zl33W/rUHCcTXGDHHQR8jkm1SvknB/E9IXBK+gRYhGzzCsJgmFm2K6f+A\nUANSrmDp43Q4T0XfgrZSUtPAMTfsUAwQaPMoHVFFM0vFTOFYd33fA/41S5YwWUfoLpFKicwqTngS\nXz+PI5ew7HFEdoSt1WVGJscp2B5B1kGoVbQukpfHyOQyWgcYYWPiENe5kSi6QixbFNQBWlygInL4\n2Qq13//Ay7YH//M32VyIefSBvyToddjotOmudbBcl5WVFoMo5uxqjBJltE6pjk2AqRHHAxxXkPcK\n2J6H1nkS+phEoxFInaENpDpDZ3onIMxOUOioh5er4OWqvOveN/PRn30PnuPu6Hu9hDMKgxTPewmt\nIXbEvMM44dLFLpeudOhFMblCDiuRfOqz/wdvf+uH+Mr9X+CNb3kLX7//Afy4hY575MoTkKQMuoto\nu8jtt7+Ft7/pjSTpADfv4HcTOs0uVs7DtV0sfB59/DTKVrz3rXu59ZYcWzzOqPkBLO3Qt85SyPbv\nMGX0EEaWCbMnOBv/R44W3scWj5Nngu30EiPqVsrhBG37ebAmcAw4TJDJyyTGo2QOI7DJ6xNoWcSk\nV8hUD0PAuvksRXkLytgoDlL5+Peuyq/+xr9iYFYpU2Mo/QCpEGjrCmRjSDnA0gmpsXHMQXrWp/D4\nIEavIigQiVk89iKzUZDDIBy06aEooEWAJI/OniGyruKa96MluDL3/6/3WdK0SZL6aLeF7w+Y7/0q\nOS9jX/7X0DxIkij2XfcGouzbdKxv0MRlKL11h4JLiNJVUusqJp7GdY+QJBFKjSPSbbazb+JKRWq9\nEU/uVusIaPPAlz9JGFlYWlBzy2wnbfpph1rVItiMuW5I0Op2WO9Bv9WiVK2BNighyLIEQgvphZjM\nIst6kFnEZEgFShukstBSkyQpJojxKsN4hQJ76/u49+3H2FjdxivYeI6DUjaOY2FZNtrsAI12VCYl\nnm1z8mSdkydGaGwP+PwDV0mVw09+5GM8+NA3sWzFV750H29/x7t47JuPst0OiYIu1dIEfs9CB20u\nXSGDxOEAACAASURBVDiNFpqbbziMG9bJK0PdGVC1VpiY3IORkhs+/DqeODvHw4+scnG9zDvfdQWH\nHyaTFlY6wcCaJRNj5IB18S+JaTFVrJOZBJcZ6vpOavIe/HiexJmnIk+RNyVMlsdXzyGMgxI5AnEG\nbUJMNodJytju9ZhU4Vh3IuVzlNJb2VL3odLdlxkzc5mc8JCmzFX1W0yIn8YxQ/TkfTjsJ5Utcuan\nifSX8U2Vgs5hREAqBHlz5w5k1xpBZD5IiaIK6SxSjhCpzyOsG7Hp0xb/kdDMv2rMXruXv4IyhdIk\nW+vnqA4bqs5HUFYTSwVk7Kdo302cvoBjNEk6jpNcwLWLtMLHSd08W+l5knaBA/k6wskhZIi0irjW\nNGt8mil+kVR8GSfZfTes0TyDMHk6vVXKKkevPcB2BKnvcmV5g0rdQ0QujhMxVNKsb6+QL06Rpik6\ntbFcFy0UUmtM3EW8xDlMUahMI3I5Uj9DCBvXLmFXq5gswzM5/tff/mlKhSKZ1DhYO+yZOCaOs5eg\nrilSgm3bWJb9Mi4QoDac46d+7Hp+5Tc+x5zr8AN33MXW1kHu+/qnuf9rX+Wuu1/HM08mBGEXN+eg\nhETbLr7fYnt9lTNBn4+8aYYj11mYfIUk8FBeTNDaJF+b5M135Hj/227nudNL3Pfp9/ITH7yIJe/A\n5iYccSNS72CYZvRvYawt1uMnWExOY9QWE9YbUdSYSz/PuHMrQgb4aYDLbfS7a5j+jbQ3trEnp1H1\npxgMmnS4StXbi+MKRpOPMCJfS0M9jCdvomg5u3yW5
4O09Z+AsAhFF8d0afIsgYAR3ccxb0aaF7HF\nPQzRICLE5jpCTqNJcXQRYS7ji5hcNIJy9pGpHJiUfupSExlKjmDECgWO82rrmiVLvjyCMh3yI5to\nYyHMJsZM0uEzFLNxAnxiZcD0yIs34xNhhIPnTmFpl0PuhxE5i4DvMIhfJM8IW/qPkEoxw70YoBOF\neGr32wG14q3EyTksLVhcX2Nq32G6W8s0WxGJybOxmhDpCCk0tYLF2oreEc/LQoxyyDKNV8wT+Fuk\ncUqxWiQIdmb4UkJ3sI00OXK2h7QEJAFGaE4ePkbOdSgW80ixI3YOkMvlEEK8zJqB3Qr73wWKSiMx\naH79l9/Fz/8vf86nvvhp7nzNXQwP72fp6rNcuXiR+sR+mquz9PsBsbSx8yUq3hD3lGP+8a/dhfQq\ntJ8/gx0PcO08/aBDp7vF8L4bsHvzNNYXOXSkyM0nxjn/okPuyMep2TOE/RolT7MWLnIo/9NoHMrS\no5i7gXL2NmIxhDEvcrTwe0gteP7ib2AvV5HJPLn6CaYO3kx76+vMTO2jv30MJUDrr6Lq2yh9BKwG\nq9FTeNKhYb5BQf7TXT5riU8yLX6Rq/oPmJA30zVdFE2m9W9zhQ8zrg7jMovKFhDZQRw5jRLnEazT\nFwOG5SlCdZ7MDPDtg6Q6oKj2YqdPUlI3QZoSSw8pCmTErxqz12wo2YteoJHdx0Bv4FmaF3tf5onO\n79CMDxGkJ5Bihsgs0A4afHHpB1nofxmhXTZbyxTUO+hkZ5BJj+cHv4Mty2RiAVseI8tSOplPLD7D\npc1HSaOJXXZ9ex6/l9DdDEk0rGzOEZg+E2M5qlKTDCK6bZ9WO2BtaUClqEniGKkUEoNlWwRBjyxN\nSdKMYOCTt9TOAw9b2BTIF8s7XTKjUdJChykf/PF3ML/4AnNXr9Lr+mRZ9jKI5++DdP4+0euVVC4h\nFLmizc2TdXJa8+yTj3L46GHc4hDLcy9im4xcbRxMiiMNeeXx+z91PT/2c2/GmCrbLz5PFHYpjY7T\nXl8nGQwYrR8lExkXnj/LynMPsnj2KXpJi+mhATP8JIP1e6jlRrDsLgVPIcQclt/B0q+hyN30zRUc\n4RLpac587V/y/F//KqWtOo7tMnroegrVaRYuP0Z52mFztcHm/HM8++yXKfWOs/ilwwxLHyGKjDin\ncMVBDuhfopW8oiNlKthJjrK6lUr2WobEOMPph4CAPfI9hOYJJA2+EvwmQg2IxGfI9DoFfTMlOYw0\nMzjmtZSyn2E9+ypV+Rp64l/TkjFCzLIi/5QN9TBrZp6Kfu+rxuy1k28VqwyCKlGgaXdDIuoMZ7dT\nTY/TMg/yYvMTnFs+zaPPfJ5bq2/BSg+xGD5Cxb6DRK7hUGRd/xkz2fuZC/+MTLxAEF3mOwtPM4j2\n8fULjzGUTZKzdvfNxfZxwqRFbWKEMIsJ+z5KZ5TqFrfccZx3vPUmxodKxFqRSEMSaZLeNtLs/MNj\nIrI0JUWQpgGamFhHWEIhZBFLCQZ+gJQOxeIQUdinUiixtvI8QS9hz+QeSqUiUu6wJr8fRm8HT5Ht\nwlFkWUaSJGht+NmPvQdhPFLp8shDX2Z0bB9xNmB+/jJxt8Fgax0hJL/yvhuYuekW7FqV/ux3MH6b\nysgwfquHsBS1WhW/0yJdXyCPZO91x9l35CTd9Tn63WW6zSZVO2ZpVqOTo9jBfhAxzcIakbxKJwxY\nfPZJLnzr33D1a/+eurQZZAM6qw10lrG1MIvfPI8VR/idPMuzZ4gGPdJWl6XlOSpextVnRiG+mTBw\nKKQeiDpD8qZdPttrPk6sYspGoKRNqK+g5U1E2Wlcc4A1ztMx1/Pm/L9GmwHaJEgOoPQ4uexe0AsY\ncZzU6pK3bkWnF7EooYQmEQ2Mkkymb+WY+SiB2D3E/vvrmm3DzsffINeZYaV3lpVwFn/dpV4x6MoZ\n7p3+Ya5snsNKZ
7BVg0fPzzJaO8Ta1hMcKKfY0mG98SzCgW+t/z7DxesoDv8c/UhTU3u4fPVviLII\npzDEZnNul92vf+m/UMqN0vO32TtxgNnZWUYtizQSLHW2iE3Mm99wC5cvzrLUznjhygob61tMzIxh\nKQ+EtQPPyXb4L1FvgFWAKHOJdUqpWmbIgJcv8wNH7sZyEl64eJp4kOOtb3o9O0Tk3biMVy4hBJZl\nvQxaVUphWRa+7+MpB2kp3nL39Xz9mQv4vWWcwo1YVokkbREHKYETMykL3PnWWxgsngfTwvNqFEeG\niSKP3uYCxZEJ0ixPq/Ft8qU78MM2NWscEbbJWw5+4hN3AlJdZqbisbUi2TN9O834Eva6A2lG2HwC\nS5cwJiQVAVs9hcZFOHKn0mZ50rBPKlO6vVka3ZBynCCkxebqKlLYDJuIp7/4CW5+94/QMucpsorH\n5K7nIU2KNFPY+qNoZdDaIrAeIq/eRE88gMgCStxNZL6KJ24nkd9mVpxhf/JPEAbm3X/HdPYTpGKV\nIb0P377EBf88w/km0ngM6dsI0WzxV9hi/6vG7DVLlievPMDaVoMkGCXeTrCdHLODHns6x3lePcr6\nQkBPnqWe80iziLnZ89i0mHe2OTi2xUzlCH+3/DjXj3+Qhxb+Lc9cWeVNh/47Lp7vsdrp4OlpvFsG\nbMX/bJfdjfUe9UIenRTZbDWoFEr4fkQ+iXDjkHarwwOzfe65eYbMatPrVxjEGSiNdCRh3EWaPFGa\noqRDloXESYRlg8bCSjVTxSP8m3//i5SG8i8hy38MKcBgdrT+DQjxPR7k92NcvvIco7Uml8sRxxHa\nxHz0f7yXBz96EWM0SwsLIARJlNE1XUQc88s/cQudjXkqYzNETciMorm0iOcUKYyOo/0Bq+tLTB86\nhSRjeGiCzC5g7AK626JYGCPutLArhtbGGqW8RX8jxdo4Ry+NaLfWSLUmDCMiP8LzFNtbG1hJjEqK\nlCo1HKdAlvqEnTYuoxT0Cn6vQSZsFJrt9jKOregGEWcevY9Tr/shhBrGGLXLZ8IYjNgBI4lsi9Ta\nQAqXK/rXSZp72Df0Ia7qT3CdfBdb5iy2VhTVHrQ8j6DKSPZBuuIqw+l7gCUG4q+ZMIdxzCRxb4NZ\nPsPJwp8zZgok5tXflLxm27C0c4icHsEblMgN93CjAn5nk5XtWR55cpWLi5tsNhtcWeyx2lxlfm6J\njtpiYyPlG889wH956hPMXlmjs1hg4Ykj+M2Ezatt9ozezthQlcHQGe576j+ztr37wRtjCJKYeqXK\n6NAYuZzNdjAgCAZUh6scPzBDwRG0A0POShmpFxgqV7BIUCYh9UP6/jKW9MjiFIGNhURaDiVR5kfv\n+VH+5NO/Sm24hC0tHMt+CQJrvcSzBPjemeT70sReWt/blkGa7oi/KWWhtYIE0qRHmvgkvW1GxqYw\nSQp+iK0kJ+84SdJv0L76BJlwsHIFSnuuwyiXxtxZ2p1txieHwR5i9txZKiMzpP0O0hFoqbA9QZh1\n6fbWKFUEztAQ3aVHCJVDJhVK21SrB7AtD684QqvZIRMeGYYIQT9MMWjSVBGnhjCLiLVFEGpcW7Ky\nuETeK9HamGd6ZAKZKhYXHkXoDCvZ2OUzRI6MPpFcAHWMUjZGX59nkhMcrryDyDzNEfE+ksxhIB5G\nycOIDHriOZTwKOnXU9cfQCvYsmaxs1swQZkrG49j29exr3gDc80/oOk3cLNXR05cs8pSk3Ws8VXC\nlVECUUNVFLX4evxWHj20hGOF7D/wGp567HlK2w4HD1bZDgeEjTZj1VOsz2+SuAH3t/6MoGEYqtS5\nsHCZXD4kLC+QpjZ55dFb3/3jS7kCEk2v3yHLEqYnJ/EHFgsLV7FFjl7ok5cJZ8/Oc+dtM2z2t/H8\nGOGFDPp9Qr+3g3irB1hCYCwH4TjsK97A//4nv0K+kkf+PZIw7OZUfhcrblnWrmR
55fd2f6ZRyqLT\nDaiUc2id0E0VtZKFZXnE6Ta97jCW42CCNjMVCyUF5eIY7a1ZwtY60aVZ7FoNckN4pTLYVZqdJtbm\nJpOHD6IdQ+x3KZROkoQh5596lAP7plGuj243aW0vUsiP0vUbbK+ugXZot59FZ4Iw9rGLeYKtJt2N\nNpYfY6wCQ1aZQWcLT1UROYduv0W1WsAPehhScp7H+soG5559iJljN5BvTJDuM2hr90XKWDyIoUTM\nZZzMoS+bLMdnOOJOkugLjOqfppF+hyevfo2Th29AyxGGRJ6WPk+H56hYtxLodTxtKKi92NkUuuZx\nJ/+cxDyOzfup1jo7rWT56tddrlll8WWDQ85JyofKiNIsL1xe4cLFVQZyG4RisFniysUt9o8cpjph\n0cnmGM1uY2z4MB09R/WIoXS0izcyztGbjhOVAuzyJr7foLsdEaeSlUaI6OV22XXzFjLL2Og36G53\nSYIBNx4f4yM/+A5MkjE6UmHPgb1Uaw4PPnaewaCDk/dptbbQSY8kS4kzgUxirJyklCvyCx/+F3zq\ns79OoVzApAn/T49VqZ1K5/v+zuzmpeQBdnXB4HuJlaZ6B6eNodHqE2WKJItod7ok6YA0Dhn0FkBm\nRMk2P/WRk7S3rpBkAdXRg7hSYVUmiaxRgu6AcCCIe5uUXJfC+Bi97U22lq8yefQW/DDjgb/+FFMz\nE/RokkY+WWkc6dl0G6s0NjdxyqNoJwfOKDiKzMTE3RZp2iNyQmSmifsxzdYiJglIs4RBrw1pgiVs\napVRivkynsqYOHQUrCEGWy02rnyH2ac+vyMh+/dWW75AJCPy5gdJxQSNqMP6pqAxeBZPvY6+JRjK\n/yjvvP4/UFFHKKthmsyyFS2Rt16DMR1ypoE2bVx9hC3xDfppG+jgyruRegDWKJlVwBNvfVXfXbNk\nafQW6Edj9LNFVi6mnBgZY7IwzL1vCrj3xA9z500VRiYtulwm8STSLrHqP8daehocQ9IO2LqgyIVT\nJFaHiu0gGOOGo4eZW1ukf0Ui12tEVmuX3SgES1m4SlKsVMjnS3QHGecuXaQdbqLthJLr4lmaoUqB\nbgCXFnxcJ4fM2tiWi2PlSdOMYfUavv63f8n7PnI7SuwIM6hXDNSAXecQ2AHIZlmG7/v4/mDXfOW7\nybSb9Z4Rhyn5nEO3nxCnGf1BRrPdxSQJRg/IogzfX0JHhsPXnyL0u/S7G3R6DVRllNrUUcq1Ayi3\nAIUSQnoM/AYbV85Qqw8xPH6IcPM8T9//SdbXQ7aXn6MzN0egJMGgQdjfZn2rg7EKEAuSMKCUz9Pr\ndhmaPExtaIyh2hSuVaI7COh2GlhphrQUlcoQCE2WasKwg85AkLK63CRr+oyNlLBKJSxVYrC1hXqF\nzFyYNmkNHqat/oYXrv4NmoeoO3mmi79DK3iMAjOgDalp4HEPIj3GyiDP4dzvYictzqVfwJg9XLH+\ngq75CshnGXfu4bL4NDprcVH9Lib6DipZwgr+3xes+G9e/b7g7OKDHKzdxepkRN8kkBi21/aTlS4z\nOfKPePrhP0LU8+QHFkvLUBrSpEFKphZp9hIOHTjCiH2ArWbMSrjOnuoGc+J+9u6dwW3tw7jLDMJ0\nl900i8hZDpPFMZ6/cJFerczm6jZ5b5gbb7wZx3g8feZ5quUcbg5YG2AP5+mHKWmi8fIKiyH+wx/+\nIdfffhhLfReKunvr9cr1ykFjqVQiDEOCIEKIENd1UUrtMDVf+u4OlRgMgl6QUi44ZGj8KERjSKMB\nWqfEfgy6iSUyDD4yXCMJDI4liJMBQob4ica1h8hZBaTMSO0cOrYpuQ7dVovSkEWmAx557Hn2jFVp\nb8xTmj6Av/giqUlQpSmkA2FvlXYosb0hVhpLZFqwvrhAGsd0oxBjQoKwSy7nIK2UQm0fra0N8oUK\nURTT6q5RKEmm9h6g3/NZXZ7F9TTDpRF0rsv
I0Almz96369lNi18gyeUhWubmmUMM1NvYN5bjha3P\ncKRyL7FZwZUTCLMXISRnOj/BILuBpLDF/bMfZt+koJ9/Pfn4DSyK+5E6oGA+S92eQRAzxb08EXyS\nW0o/wUY8/6o+vGbJ0hgEzHiHeeLyIwyP1nBKTSjkkINxzrzwFM+NPEd9ah/DToVLjXPUx0tIFVLL\nzbCVzDEc3wSZYTN7jsLB55m5ciNe3uL8kw3IxYxbiq31TY4duhP6T71sVxDR8jV5bXH3qRvYbPpM\nlMYYGR1haWONJAk4fuw60kxy7sIs5XIRO05QUuMol7e+4Wf4uV/9wA6RV363DZwC6tV/7CuWEOKl\naiIQWKQZZGHIdqNBrVqhXNy5z/bdCpSlhjhOae1MeiA1uDkX9IAwaOFZeVITEPk+XrFIomJSS+FZ\nCqMNrqyQmh697gqOWyeNUnBtXDeHqY6RhAG9xjxZuE1vEDExVcIt2oSDiEi5mLiLpXr47W3iNMNI\nBx0u4KkaG4kk7bdJlCHqRziyhGunCCJIBc3GKpkvGJocodcpUCkOkSYaJ19gT6nGUCWP3wlwSjZG\nC/rhgIq/uyPV00+iGCavhkl4CBkcJ3Fd7FweKUpYFMi0YCn9Fs25x1kUPgV3kyQJOTb6br518TF6\nE59gyKuzsnaZ6eHX0ctXcFSd1eQ8m/0NxvVdzLcvEYSNV/XbtTvgO6Mcdm+i7wywUxenFRP0u2zk\nn0MVLAp6hEy0mV/YQEqPNB1Ce3N4jkNeH6I43UUOalTrQ+iNI9jFNRrrPuPjOYwuUc5VmHbHCXKz\nu+wGcUrU79IMMsxmQpRk+HHM/MYq103toZ+EnDu7TKO9RWl4gtpQnv5an6Njt/Er/9vHGd9T3RH2\nRgDmpe6Wehmn/d2WMPAPOlu7Pxc7xGJhSLIdxmYaJzQbPYxWVCoOWSbQWjAIU+IsIe1npDrDEQJl\nCbI0QicJmeciMkMa+TilCh7DxGaTbnsNKT3iMELkHaJOG50DkSvhJhlaDTCDHtXSXvxog6sLK/T9\nEFvkiMM+wrXJuRY6zhP1WghhU3Bz9LptCqUhfKGxRYpXLGIJ2Mp8MDH9RkKmFZvNNlngUyzWiMK9\n5MtVwu0BUZyiVMhmo4HlWPi6iQlHQGhMu4NJd2/DUvVtFD8EHKXFXyELDYLM0EkinufHaW3m8IMN\nBsM2Nwx/gD39H+LhxT9gbWuVuvawxUHq9l1sdzLWFk+zsvgsd9x0jJWVP6XR3cStHGW/V6cdhbTa\nl141Zq9ZsrRbbb7d/ibXHznOUvsqncwFMUyjsUJlpECeMfr9DoFZZJDaVPObKDfFjzukZpvNSx6j\nZZuuVAi7Rb+fcHDvKc43zjDY3iaox+isQBzt/qfYXmuhZIrfCbE1SKUouiVsS7DWamNiw6Ejo9zo\n7SWKU2YXNO9/78f4sZ99L2C/tN363kE8y7KXhoiKneT5hy3h7x7cgZevuLws/ywkWZoRhwFZptEi\npd31yXkujmORJCGDKCHLdkTDtSWoFys89eTfkUQBlpXD6B3Ks7JtEqOIEbSbK7j5PEq54CrSzjZG\nuKTpgKgVYZWK+HEPR1UQ2RbFyjjr24/iJxERPvg9YiMQbg1RKuMJD+XYJEZSVKBtQzF3iF7rKcKw\nT7/rU98zRmttmThrIKICnZbHTTfeSuaUcHQPWahgO/uwjcHWCUmpj53Ps7mikEZjOd7OjW29ezv7\n2MIyb57O0+Ap7PTDfHXhD4hI2Vcao7HlkHaHgVGis02+6P9XBkmTREl62VX2FPazIr7DRvMqI4Vp\nNuw1pLb59vkv4IcNlv1LjOT6LBcU0yM3YKrtV43Za5YsSWaztnkWLQ3jk3V8c4brJz7E8ta3iYKI\nTfs71CcE9XyRwfYayj6GHlRxi12q1iEGehW76pGEDYLEJsugHawQDxR7xyfIV/YTDAa00wu77E6O\n72dx7Tx
BEOKViniOg3QUaRKjM4lnG+LUwsQ+BTHG7/+nP6Y+UgIh0EYjNCC/NwORUhLHKY1mD6UU\npaKHEII0TYmihCjeEbQoFAo711tgB6FtdqqSQJLpjL4fYFkeRoHlWPRDTU5omu0BrU4fWwmiKCCN\nIHE9Pv1/fpI4DUh1hmsp0sE2iVtB6QC/v02tPkow6BNnfXLuOBvNJpZSCGVjey6DyCLvFugNmnj5\nmPZ2DAgybTPopWgToJwCabFMQUuCtIuTQa/dxxMWGQqtN7Edhd+PyOIBy4tzmFiSc8cxUmFbgqXV\nbfrb58nXxzh88Aib2x2G9x3boSlrF5G1GR0Zwx/sNE+EsAleccC3osN86bk/Zm1rniX/AsdLdzGf\nLbEu5yk5hzkyciNnrjxEUFzhZO0uHjjzRVKZoXJbbHTXKYopNpinVLnEUDrNVX8LpZ/A6JQ47OOM\n1RFpnzAyHJy+B/jy943Za9YN278nZXrfMHl3lE5wmb3ezcytfoNmeAXcImSGJCnR32xw1w0/z7hz\ngnpuGCspYkzGVPFObJMjSPvI/CoTQ0eI1BZ7RgooE7O2OoeTpfivKOmLjQsMFcd2BmNr2yytbNJq\nbtEPAtLMJzUKXIOKx/nwP/lDymPuS4kBSkqM0LsqhRAC21bUa0WUtNlq9tFaY9sK1/OQlstGs8f5\ni7NcXVoniDVGxyTJdwlThgyBUhJp21jSI+fl6bRbnD1/kWZ3pzW8tHCZxuYiQ4U8aytXee7MGcKw\nhzQanaZomadg2Qx6G/jtddY31wjigCAKcGzB+J4pXNuiVBonSwwizVjZWENqidIOSRQTDRKE1DQ7\nEZZbJeo2yLrL+FlIr9FA5asMdICWCe3NFZrr5xBxhOdUGRneSzmvsAuKwF9j0F9nZW2ZS+e/w6k7\nbuP4a15P6geMjtZBDDBBhOu6FGrT2NUR7NwQiAwvVyTv7Q7LlSsvcvriFTqphWhM8a3l8xT1EQZ9\ni55veGzhT5lbO49enuTBS19nyL6RRrdP3LMZNPsMlU+QhT2S9TG6Gy1q/Sq2ifBSw3R4C/G2YGvN\nZ8iuEm2uvmrMXrPKgj2NHrTYTJ5h5thhmtY6tx79MC8sfhPhZfiDKjlh04lHWJj/Owb6MjfX30Mv\nrnJ+6zTpSJPl9QTCgAPeKZr+k6hiBdsBO6sxPZWx3rhAQe+es9SHhghaTe669U5OP3WO7dYW7e0O\nk3unqA0PU86q3PnGn+KGN92GLR3QAmSKMeqlpFG7Wr3f22YJCgWBES5bzR7VcgnPFVRKDsX8FIur\nG/Q6La7GCZPj4+RtiRQpBkjTiCSOQFiUS3n8OEELm6GhUVqdJnHQJ/RbQIEozVhbm8N2LfqDHlI6\nqCxGKZvBoIHf7zDQDhY+Y5MHaPW2ELkcubxHmnqYLEHjQE5S1nlyZY9B0t3ZrtkpmbLpDRLUtMdI\nuYrK5Wn2VjFRRGvhCmVL0zcptfoQSQROscb2peexbYVXHCPqNCnkp+hFTTyrhFPJc+7yBcrpRdoB\n7J3eQ9JrMggD9ux5I/NrF7F0hG3lEN5OBU87uxkpC/1lsDIQw/QrZzlefDNb0TNkSYn25uNIsYfW\n+hJB7tto49FLXsAfuESqgeePsLW9zNqm4OCeLq899uM8+fzT1Ouv5cr653BH2mwPAuruDE/NPYzl\n7XvVkL1mybLZWUWWDflCTDd+Gnswxbn1v2KuN0+OvahaynZLkh+ZRgvJULiPWf8cW+tbdGWDoOXi\nJCn90KLVbJDFglIpZHOuA3mDu9pFxQE9wl12TclFt+DJy88zNVnn1M0H6Wx1GCibvCzwrn/0m0wd\n3odg55qJNgk6zXaEv1NNp9On1erQbjcoFgscOXIUIcxLZxmbUmFnet7zY7RW5DwLJQzFYplB3ydL\nemxvOUxNDyESFz/qMOi2SNIU5WYYnTIYxKQm5ezpR3buh8Uhke9z6ORRZ
i89wx/9239HFPd3aGI5\nF52l9P0maZphScXS/AbH95VIdUqlVEPlKxC0Gd0zQxK0cFwPmVekTomi47C9tQom4eDMFMnfrXB5\neYs9owWmD1yP43mMqSGaeonK6AhJP8Ip1NlcvYQnBSJ2qI7UaXe32VxeQgcBfhSjRI2O9KExoJ6v\nsO/mt7F8+js0+32GKkWsep3V5iJJqtgzcSOdcAsGNlk6YGjo8C6fBUJTcAr0WhtM1G9krfMM3X5A\nURZxlUMvbSByBUzq0s1CVM9haEzgVQ6RUaTbtjk5cSOHJ49xevlh7jzxTtzCMGl3npuOvYHntv+K\n/nbGePEgdXcG+P7aYdeusiQx5QmX7e2IxnrGaL6NH45Qzk2hQ59+2CMbeFjeAgfH7uWFxv1MD3bU\nowAAIABJREFUVU4QZuep6nFarQ6OPcP/zdybB2l2ned9v3P3e7996X16m+meFTODbQASFEiRlEST\n1EJqM0uOdnmJ7cTlJK5yUq4UlUq5HMd2FFmVKCrLKi1eYkkURQlcRBEECRAgZjDADGaf6el9/fbt\n7svJHz0EMSzBkcNUwW9VV3d9XX1Pf9+57z3v8rzPo2stTD2iE9qcsT8I5Zepj41REM/QCFt0th4W\nHosNFceqILMOm3sNOqMB0+N1pi0Vxz7P9PL8W4l7mibEcUYUJgghsB2DSqVItVpBiHkEkl5/wPra\nNsvHj2HZGopQyDkmihIx8g5ZEy0jpVYy6Q8dBt0hIuvjDko4dkqcqHieh23lUIVKPxiwvnKHwaBF\ntVhh5A1p9vaYmDlLs3Ofr3zlj2k3u6QZGJqJmmXESUgcJyAVTFPlT79wkbP/9ccQWkCq5BGDAcLM\nEUuDQZih5iqkYZ80zmj7fZzcBGHUBsMkiiO6HRgNfFZeu8LYsTlkHGA7k6zfu4G0SoitHUxL0h1G\njM04RJ5P0SyijVWJgi6yt41MIlI/plgucXvzPteu/zM+8aM/S2cU4QYBRUMlDPoUS5PEWZs0SrBz\nKk7xJH3v4QdcRT/B7caXmB97L+2dfQbS44n5j7DXvEgQLjM93YGCid9POVowaSX7SFmmUBqjkpbZ\nHxvxvulPcj/4DPaOyt3y54iiOWRxwNqtr5MbP0U8dp0l6zz32tfe8Zb9y3Adz3Ko+jXOoXzEb0gp\nf/W7lcrTFZOdrW1yhs2RsbOMlUJGXodyco5moUmj8waOYlOwixzsvMxC7SzpoMnQz2MVfAw1QmeE\nWrQYpGuU848ykvuoRQXXHHDr3r/iwtxPklu+AJffeGvdpNhAYmPqZXI1Hy/OeP3WKhX1NP/w13+R\nKArQNf3ww9EMTFNB5tK3wi/XDQ+RxIaCoSuUSgXOnjtJtzfkzt0dTp5cxjRULFNFVTVGXoChWqiK\npFzI4Q40PH/AxtZ9xscnGPS6h1mLchiOfe1rf0itNEujeYft7ZQkzZiaPMagv8Ht21d4/suvEacp\nqqqAqhFlMUkSk8QJtl1A0fJ8/Bf/EYivozNPe2cPRY2Qag2z2KK/3yH0PTQnRxwOMU2TYdwhIcX3\nJLYKo9DD1qewbcnB/ZsUx8cJwxvIzKSYJXhFn3LhOGgtGt0BpXKN9s4uw1GPkd/FG8U4eRtFlxz0\nXOYn86jj43z5xa9xfvEU1vQc/cynSpVgNERkJo5l45TreJ0Qx3q4kdxt9+j5HjMi5PyJD3N99wXu\ndF4lb+YQco9RLDm/cIy9wR5lo8KR5AwH3l2G7S6pmTEReVze+acMdQU9nOdEOo2hVVAtg76d0Onv\nEYYxN/Wb7O39haIPfzlnAWLg70sprwgh8sBlIcSXgZ/nu5DKq88NOel8BLd3g8z2iAINqVg0tRUM\nx6Y2micRJnsbe1hUsNUVDpKQwaCFpdQxzDp5JUdpbJ6OdxenYEJ+B22txO7ObUQpzyv7f4ga1h56\nMwu5D9NQL6GKEJnplHIGk+UKM89+j
L3GLll1hlJBoqsmCIiiGCkzVDVDVQ1yOQNQCMKEfi9ACkGx\nYFKrFigVT3Dp0hssnDjJWNHE0DUKjkEYxNg5jbypoxgm6aiL5/UZ9huUa9OIRCNIIobdNoHvsjO6\nhRuPEImgVK2zs32bZnuPz/7h10lSUISKFApJliJkiueNyOeLaLogCTy296/w/gtLbG/cQ6KgRRLP\nXcG971KdnkYxYtLIRcQpPbeHFBFWrkwchSzUbW4cBOwNYo4t1FhauoA/2sONY8w0IdUdcuQ4aG1i\nWDmSZERzex9LScjZAoFBUdE46PQxSzlMTWF1s0su57Ewe4S7jXuYu6tc+L4fwhtK+t0tZheWCSOB\nlaS47j6mWn9oz4Sp8ZjzEby9Dncbt9DqBseVC1zvfANhxJzIPUa/XaUmVeYmfhDdXeF24xq7UZvJ\nbMB6usYSp2i7MScLC6xsrlCbTNkKb+P1brNkf4LdoUteL7DXelhW8T/JWaSU+8D+g59HQohbD5zg\nu5LKSyPJ/KKkMa7hj/ps38545NQz3Lh6mb1en7pxDL0qKKgJUSEhlCoz6iL1sTxeEFJ2aoyUVQQe\naWGfnrcOZYFSfoa8UUYmdYLynzPq5B56P6/deZ7FQpl4IEizkJzUUYo1tu98k1LpCAYWleIRRn6I\nqR926JNUkmUSVU1QhAZCYlsqjl0AHpSJgwTdUHj66cd58WuXiE4soYgEUxXUx8YIfB/dNKlXx4nD\niCRtE4QjothF0TTSKOH61VcfjCxHCN1GdwRX33yVP//iJRTVJAwkqvYAXyZBERIvcLHtPIqioUiD\nH/mFX6J37wqNY+dwDJ1MUfDcmLGpCtVkHM3IEyQuqqajChNv0CTJQkajPmqaUKvriIbP9dshvnuF\nUmtITokpmjliRWNr903qpSrDUZtiaZyCNY41vUxj7Q6qAbpVY3/QIEXlYGcfp1rFshMsfYLXXr/K\ney48Tity6KyuUzm5wO3Pf4PJI3VmT38vQXONyWNn+MYX/uihPTPslKAd4igTkBPEMuZi6yVOjr2f\ngXuRvdYeS0cWEG6dK1tfhkBSKcyzVFhkXf8THtM+gtwziYcv0pr5HD+1/Jv81lf+NpOVZezyx9gc\nrGKKPN3hiOGoCjyMJ/yW/SeVjh/otDwGvMp/XCpv+21/9hdK5T158jQN7S5ZrDGZL7F8wqBQuc7Y\nYsYPv+/HOX5ujnDX45ELU9QqQxzdIAgaqGaHJ5Y+zs7Om9jRCSx7lrJRJcJm87rFvn9AV95i0L9G\na7dMMXtYJjoSr7MTdNGFjiYskiCgPDkGmPQ6DQ7aewCoiiCKJUkqEEJj4B2WVP0wIssEoDzIbRKi\nOMEPE7o9nziBD3zoCb754leRwuTzz/0H3rx6DdM0EUjqJYPxyVmmZk+wsHAWx86TpSGp79FtbtLr\ntgnSgOeff45//r/+Ll/+4msIYeH5PVRVRVUMMqGg6QpJGqFph2KvQghS4MyZMrvtTVY3m5h5h0Kh\nyMTMEYzyGJOnTpNZDlkqGfU6uIGHmq9RrEygChOz6PD8KysMekO8FCx1krKmUq8voWgGUeZTKB/h\n6BMfY3rmHE55FqVYZ/3Om+wO1lnb2aOxdQ9bjcGU5AoVCCWNg5iBf0CxMs5Lr75KxXT5gz/+Lez2\nkBurdzgycZxw6yqN9X3i7g7X+w/nDTmKUNGoH7WJitdw9ArnnVNEW2+SZir5nM/l/d/l9vBL3Om9\nQRA1CL0erXBEY6VOYzSkJ1Z55EINXX2adblJdTFPrIckacipsXNk0iUeCfKjh6mz3m5/6QT/QQj2\nh8Dfk1IOvwO+8Z8sldcdZVTF+2iXP4cRnsG2z3Dx+i2Etcpo9auoikHh7GUu3z3CeMVCcxSG8R7F\n3FHeWPk95s779FY2+MFH/yZ/cv8+JWFh5+s0eveZnjiHsjjESZscbD8sE+3MjBF2r6A0PoRQYjRp\nE
WQmqvAZuntYmiCMHyPvmPQH/oNGYnooShqDH4zIUolhKBTzNjnHwjRVhAp+5HPQGlCt5JlfXubm\nrStUx6d57dJLtLstHr/wPeQNSck2yBkamVCJ/CJxFNOK9rizcpfn/uRrOE4Fy7QwNI1UKiRJiKo4\nCJEhkWiaQuhHRFGAkyuQIlFRObZ0mp1797BzdV6/co/3nDtO34sxLZskcLnxjZdJVZ0s8YgCD3oh\nUepTrkzR6rdIfUGxaDHqpAzdEd1gDJo9eoGLDIboqoNqFfnSZ38HwxxiShMnV6ZYHaeYL9B1ezT3\nAuJBiOeGqDkwrSKEAZ2BR8kAL4KrN+9y+txpvvDCcyyeWuLqGy/Q2evTSlqod1RapzR489t7ZlmT\n1DKLbkNn4JrYNZM15S5GPcSRdYRxmjDqM17yyPdzxLFHYubpBLvMH1lm6LdZ696hrxU5WzpP2LqM\n4s6xb7zB+eqz3B98hnJSRp/f4VDI7rtwFiGE/sBRfldK+dkHL39XUnm/989uYTsbqKnk1DNdJs/u\n0I6vcXzB4sblm9RZxIouUFQLTDhrSCej7jxKrp5j6K1TKqpk1SH3Rr/G2dKPMWAbf3qD8cajbIxe\nI1/Okfdn6A0frqzU9HN07TV8EeGYGkqogIjAMAjDHl5UZWdrk1OnTmKZBjIDTdMoagpGLAhChSRN\n8YMI1+2Sy1lMjJWwdQVLN7FVg7YXYhUL6F4fpzrN2to1Lr+2TxBHnD33FPl8CRSFKPbIVEmxMsWf\nv/gnfPPlu9SqMyRxhBeGGJqGADIpURTlAUwmwXcDyCROPodAQUUjI+Vnf/5HuPSVf0d14iQvPfdZ\n/s7f/CR7t25StIuMBm2SIKUwUQfdxKzXkAH0+4KdtSuUnQpdMeLEYoG7u2381OXuQZEnJnqoWgHL\nLqPnbAyZcf7s06TRgFZ7E98N6WyuoSmQxT5CZgRKSmpKgqEgiFyKVo6Ngx72hETXVBp7bWxVst9o\nM390ijff+DrNoE+c6rTPxST3Hg6dr/d/A9Obx08DZsQ8W/1V8qU7hN4cWTii1b+MYoUkpQVc53mS\n4QKxZzMchniDdUqliKkjc1j7R3it/TzT4zMU8+M4/QKXNv4V+u4CN69uEMuAyNv8/+4s4vAI+U3g\nppTyV972q88BPwv8Lw++f/Ztr/9bIcS/4DD8WgYufud1/4d//gh+6yTxxB/iDjJaa1UemalTrU5T\nXfo+Rv0+G3sb6E6MkMvgZ5ycfxI9MzgwNwnabbzyJit7c7x/HhIp2G71GfhrhIlktD7kVKHMwjEN\n3oaN29m6dSgh4ai4vRA9lAReD/QSvjciLIbcWbnMyVPL2LZOGCZkWYSum1hZitAVFFsjtgz80CMM\nUvYbA+oVG4SCYUrqqkG3ISnka+wfbNBq7aGpJnduvEqvd8Ds4jlyuTytzhb3Vq7wO//639DaaRMn\nCYqigQBd05CZJE4i1Ad0SQCDwQBdM7CdHBIVJKgqPPHk+7l1/ZsEgSBPzP7A5803rqKmGZrIqM/N\nIJwCWgidnkfkeXiRxChWOeo4BLFLNIIjkydZGHuRvY5Pa3+PvWKFMStGqDHD4QhNCFK5QZIZGLaB\n5iiMW5PEcUCnr4EeM+wMsRQNaUrCUURQNcCO2Ok2mKpViH2fnjckSAx2tnrEqoc4ajB7/hGi0Soi\nn8DbBNv0LE/OKTNdi9ht7VESMwQ9HSOr0opvMDPxDH5o4XZ3GIgKk/JJPLFKqXqUkA2iQUSSCWJr\nh4pyFn/QwTYz3MEYo3aEKG/zyA+aqMYUUoRc+dJf3MX/y5ws7wP+C+BNIcS3arD/Pd+lVJ6bvobu\njPHKjRrHjx2wMP8sm+4mS4P/mcLRMr/21V8gr4yTmwyozryHvb2MF67/EXMT87xv+Ye5v/UmGga6\nLlmN/pQLj5+n+41xFGkx8ppM1HTuxX/OaO3hp1TePooX9tjoNZk
ROZRMJ+655Gbr+I0BcT0lUDyu\nXbvFubOnME2DND1M7TRDAyUhSTNyloJt5ggtSX/g0uz6TNVL7LdHWOah6leURKzcf4M0zWgPd2h2\n2kSpy/3VK9y7t8FrF6+RRApxEiMlKIr2wCkEMjsMucSDEyWKIsIwxDQNTNMBJJqikKaSp57+ABee\nWWR35QZ20WHUWAVV4fc/8wJ/42d+nFwpD06Bfr/DqLWL58UokcAu5mj0VjANhySK6HVbRMmI49MF\n7rRaqL2EzR2TacdA2C5qAqGZoSlF3P6QwWBIXrdoeDfQi0UKuTyOM0ni38EdBWi2g7AFSRiQN2x8\nXzLqp4RZggh1EkLE4izylMakVmZ3+AbvO/ZhLjW++NCeVTmFxhyt5jfQs2nKxjF8bZv13Rs4pTH8\nsM2+v4ITW5j5OUL1HiM/4bzusSZsKuVl9pq3QcnRH6yQs6u0k1sUCuM8ff6n+Nrqb2I759gfvUHJ\nnOCd7C9TDXuJdy4EfN9f9KKU8h8D//g/dl3Fr2MXt3hv+efYGHwJOz1K2E+4mvw+3v4aRW0CRy8y\n6m+y0tghIiIUI+xBgaujb1KZHrFknmWhUOPu/Q3u7F9i8UKe9PoBiZ7HiCcYbNxn/sg0rH2bAMHv\ntpB2HzE5htaxCLUBysgmJ/UH/RUPKce4s/oaY+NTTE1U0HWVLDsEP+r6odpXloFuSFQBajlHZ+DS\nHXpMVHN0hwGabSP9gH60zslnz/A//d1fxc4VgeukqSCMRiRJgpIJpEwBAQ/4wR58hiRJQhzHyAdh\nmGFah2BDxUAhJUNBUSRff+k55hd/AtfrMnnkJDdee4l6pchue4ReLHL3/k2iUZ+yXWboD9Eth+rM\nNFmqcPzoMbIAWo0NhJpRKleRep6/4l3myys+q60e9bEyY0GEpipEPUl51sbMSezyFO3dLQy7RuR1\n2ekm1Cd1TMckFjnSwEWXFsNBhF20kJlHiCR1TEJA/Z5xwvEuT57467x0/X/HyTncGN2gmD0CbxMV\nyufnSCMPzTNRGee1O1/k6aWfYLLQBTPD97ewkjlsy2baOcJ26xaGmudm5KNrKlvpKzjlOdQ0TzE3\nQ8dbIdhWKFV9Lt3+I/w4IsptUS8auJ2HoTZvt3eNRf8/bD1BAZ1bO0U6/T6lXIxQJXv3M/KJzuKp\nOarJY1xtfRMtbpMAHz3/j3jx/r+mkC+RMwSDZgPr+H2s3Y/x+JkTfPXNL3Jta521rRvUx2zWN1Oq\nCzGf2fl23vJL1lNI2yNv1ylvmsSxQEsjxs49Q9jzsIt18oVxyAJq5TkeO/0MU1NlpBQPhrYO51Yy\nJEJ+a05FkiSSoRuQszV877AyNnOkxn/129+H4SnohsYf/8p9yuUCgRfhBx4yFaQiASmJowikQvYA\n8p8kyWGuIgSaqqEIFWHo5EwLmaXE8lAfhixlOOqRpQlJIhEiQUHwofc/gzpo8PEfepIJw2Vmfgl9\n8gTS9Wm3tul1M4a9XTQB3UGEXVAomA5BHDAYDtlv79Fsebx444AstfnR90+hZn06niSKR4gElJxC\npT5Fv7NLSc0xEimdZoOKU8UwTXZaHWaqBdaaPQp5m5ErEGZM5YkS8XGPcvAsbfcqrt5lrHiCySM3\nuPW6g25n/PbOt/nefnb8FCkqQdihVDzPUb3K7e7rTEy8B99/gUxZ5PGpD3I//H0iJSTyurhDMLIx\nEjFJ5m9iFsqcOfJhUs/mtbV/jxAJtpDsDVJqjo3QC0TpECX1+d2/v/afF4t+45bF1cY9yrNTzMgl\nZo6uMq//A665m3zj9m/S+NoW5YVNPvrRD7Az2KQWvofRaMBBc5Nms8q55WV6cp2VV9rU7c9zpfMc\nhDmOmT9AcXqSa81VhN+hGEwA3xY0mqufZSd8kTTV0U2HLPNIfUE29NANncjtE9l5bMXB89psNDZJ\nFYWxSh7TUBHi8BQ4ZKg8dBQ
hFHQdapU8SZJQKlt0XRfTkPzg8qd44fZnCBz4np+e4dqfeWRuQJJk\npElClqZvsU8qyuFpoqrqIa3rgxNF1w+Rz4ZpkcgMVWhAhhSQZskhJkzVkDJACI0MeP6lVxivF6l/\n4SKf+msfY+gVuPvVF7GihChrkyg600eOks8ZTKERuCO6vS6GYTE7UcMs1hHyNg4ufdXjyxd3ObsQ\nM1kukBTGSfw+ritpbGygmzbdLMGwVQwlR8t1cVKQcUQ3TLADFc9MEKUCz/6NZ7l0+yL93YSnH1nA\nj0Fkc1zbfonN2ERLJObw4Qd4pljEMqKcO4qKxoZxkbEjBqrnE3uTBM7rPL/7BmPlHJrRpnVwhFjv\nslw9R6/dRsu9h4H7OruNVxhFA8aMGTbdW4hcgaXaSda7byDcA/L5SQr6BPAwMeO37F1zlhl+Emv5\nCzQ3+yT2KkG7ijj1eT7xvl/lwpNj7DbXeWPjee5cbOGLjJ5ymZutl+h7IY/M1nl95038QQWZKly5\ntkV9XiE/0FEnb/HqlTfIWxUm5DinCt8L7V97a92DwT7rvRbvm3s//WgfM4vx3ISdG6+z8NSHGPQH\naOEAJ1clCQP6nV0qhSqSlNFgQKlYplIqYFkaqsiQUnnbQJdEVVXSLOTY3CRJEtPpeTT7I3AHdDKX\n2rEyB0NJ+iC8ejuCOU1TVFV96zrf+kIeElkcxsJvI+EjoT/qoesqcZJiWTaKphLHEVKqDL2EdpSi\n1kpsX7vIVMUhX5nDsJcwtDztdo/2/ib7zS6FagnFjenGXbIsYRAMEXrA9z6R47OvRXTdmFavSBJ0\nEFmbyeWj2LQYDAWjoYtqOgR9n1QKUPXDAS7NwvVT8rM1ji28l9WlO1y5uUWvqzFUJd+4+g36mk+t\n8gbJQGW59iOsOd/ECx8OhYzafcKdZarFM9y6+hInzpfYarvkCiskIiIczaOqPbb6MZosYeY7VKoB\nL1/5AkdqUxDdY8xcQPbHyfwWPd+jUs6TRhGGTJirz9F1E2TgM1E/wXf0z9+yd22eZU28wurqDXpa\ng4++78fJG+f4jf/zeX7ljz5InG2TRhMMuhYzlVO0OyusH2wjotMcrXyA5tAn7CeouyZjzDA/v0BZ\ne4L8+Dx76+v4Tfj+s59ibO4UL1366kPrutF95qOz7OxtE1QlqhTgxUBI4+5VlDREZhmZSEmVlG5v\nj3ZnH8eyUVSN/eY+d9dWuHVvldure+w1+wy9mINWj8HIJ8skUhqsbu9yf32f+807NEcBe34fuwDF\nCyGP/cwYqAKRfWuADFRFRdf0t5qMuq6/ReOqaSqGqZOmCYo4zHE0JFEUo/DtMQFFPQzLFCSagCTL\nMDD50z/4Y46fPsfyUz+DapRo7m9w995Vbt+9QtgecfrcBZYXTlKYHKderlEbqzJdm0NqCv2Ry6mp\nFFVTuLXbxRV5Iseit7NHbxiTZSZRrCEMSYJJLDLSSOKlEUbOJlNNjPc7RMZdCAVOqY2V05ipTuEz\nInWbbN2L6aVNNja/jjGS5H3noT27e6nCVDZDHO2wcPIM9xsjkA6jfhdbHSMMDkiCCnXtFI2Wx24L\n0sEEi2OLjOlzyLDIvd0tdht36cYRBUXDdwVKOkfX3GNq6jxjxiJ6WeUgfOex4nfNWe6uvkl36BC4\nBp/7xn2uX1/jXO2TfPDUP+BPX32Rl28/RyPc4sXbn0WJltBFAUeOMQj32dlu49gVBsVdjs58GNuq\nI8QW3bV97ILGR578ARTLo5rP8eHv/dRD656af4bS0jTTZxYJRZdE1cmcQ32VqNVAyogsTdFliqkV\nkaQ0mpt4nsvikUkqxTIiEySpT5x4DIZ9eoMBidTYb3e5u7bBsDsiCQWpknJt5wVU64B8Drr9ferB\nLOVcnY/8vROkWoKqqKCaoCmHBBiqinhAGq7r+uHYsqYSJzEIyGSCogoymeAOB4ekFkmGTDOSO
OSQ\nOEMlSVPSMMMuK7gdl/GZ0zz/3L9kZ/UaipSMj0/y9BNPUTv7JI3GAdeuvEKr0WDY7xIOMjRhMFdb\n5MSpZY6Ol3CUhExReG3FI40UumlIGmXEMkM3JbELXuKTxBqpBVOLT2N94DiP/rUP0xj5dCv3aDY2\nsLNFUtUiCWyydETRrrA0M0PeHsOtrtEqrHOgbj+0ZzPOPJ20i59ESGvITOkMqoiJE4tmr0uSCVQv\no+FfoW4sog8Fo1FMomfstzcwgpilsXn66j5eH5rWPdRchfxsQKaFhHFKIJpEniS13rkpqX7605/+\n/98T/l/sl3/5lz998mMq/r7AU1ziwQ63+tfohDvcHGxgUuZ+5yrSV5BujZHdZ9jbpRPcxVDGMJKI\nrrrNTG2B9e27uGkfwzTRK3lyWol8JeNg1EYRJrc2v8LV7NvVsMp2B+lkqHoH19eQ+JiuiZNTCdwQ\nb7CPWSpj5ipY9qEQURB7pInkyMw85aJDIgVRmJIm8SF5txTohomhacRRSDfoMVmr8pmv/Q67gzWS\nSOc9j32c+cppRukqgT9OwVT5oZ9/lr66SXpwqHyMqh/OqCgKGaAIQZpINN0gyw7zkkxmSJkxGPQR\ngFAESfIg70GSxIf/k1APiwo151A0aX3tTS489QTFQoWBO6TZaLC7s4vX3qRWrlOrVDhMglLCos2o\nP6DbHqCZGUIzUKVLs5siMtjuhIQjBcsWCJGimpJYVVAUE1VR+LGP/RT1ow4HYZu7vefRwjHcpiRn\nT6EGBZx8DmMYYJYcJvVJ4myfgddjFEqEZ+OjsJL039qzjy5coB20yKROlLZIAsEo28eyTOq1CVRZ\nQcliDtohxtBimFgYSUrXGxKlLoka048Ei+OPk2UJQjqMF5bY3X+NUuFR9jc32OvtEomEilrg0nMr\nfPrTn/7l77xv37WTZXJSpzY/j+cOMecDpufyTB1RiTshJnXMzMEd9ZhQ5rm/chnD1ilqE3hJSqPT\nphn02RlsHU7DaSZBKom0NVZ2V1iTa6x19lnvvc5Tj//CQ+vaizW0SZdePyawWiQlSaa5RCkIVYVU\nYbC+wsjtE/gjhGqiINhvrbK+sYMAJsfK1KrjqKbFIZDyUABH6AaqZiMTwXajyU988Bd4YuqvMDaf\nEGcJt27cpBcpdPrXiOIcly5dYXF5nI/8dIHFD+XJFw6FXIUAU9cPycDzuUNyiwzS7NuVMh7kSWly\nCPJUFAVNVdE1FU07LHUHYUjedlAthYPtEapZ5tbtVXAjji4d48wjZ5haOMHKxiY3N1bodQ6oT81Q\nyBQWFxY5dnKOOBDIVKE+luPUEcgXBFmqs+8mXNuQ3N0SHHSg04TNXY+f+tTfRVFV6k6Ftr/HlLeM\nEkSUJmZxpCSkRxD1GciQ1YNN7kerDLOAWNXJRRMYsowejz20Z63mfZJQAgki8TAth8nS4+SMKdzh\nPkGyT9kqUC/lyU/Z5J2AkIwo0qhZs7R8DRH36LdvcPL0R/g7P/6riNSloiwTpx082UZJdJR4SC9+\nZ8KKd81ZxoxZBsF1/tYzv8HUgkrSPcKpR54isrvstF4lbKtoRgHRC1ksn8eOZhh0fJZJdjZ9AAAg\nAElEQVQmHmF8aoF8ush47mlOOp/EJyNOQwyxgF7Lg5xCN0FEOV5f+/pD63YPIrKwycbwLtEoYxSb\nqI4g6x/C3ZUMktEA32+gagmFfBHdcBCZZGXtdVqdAbqmUi5a1MoVLKeEqqkM+gMUBFYxh5WroysW\nvZ7Lz33iv8WMF7j/+k3svEMw6qIrGnlS7GkLPy4jZt/L6fPHeebHjrP0lIKu6mRJTOAHkCWo32Je\negCxG/QHSClxXZ8gCA5PHU15SN9FUQ5hOrEmCVxJpAj+/W//XxxfGKc0McHd27e58cYlGnsbnD6+\nyJljJynWJrlx+yadzgFXLr+MpTssLR9HNyws3WHu6CTzE
wpLMwq13GEZfeDHNDoGH//4j/FLP/PT\n+L5LvlhFhiE1p4Q1Pc2wKuj4XRLHwC+0CYMRtqOikNLZG7A3CLBEHj/JUZ6cYq76cGPQyzSOz55h\nNlckkXn0Xpco6hKm+yixRRKOGMYjHCcPWR89Z1Mo1hnTbdxkjVlHJ18p8lT9B7l++6vs7N1HWA5C\ngBL6mE4Vq2IThhVQHoZHvd3eNWe5fWufEwsZN5r/gvHOswTudS6/3OZUcYkb99sYFR0tneK+cZXJ\n6nFi2SequTSaW2S9BpYaEvoW7fyblFUD6SrsDtZwrAKj/hBF75GbsFDVh8uQwcinPyjimBn5fEyx\nlBCUVWTcRmYpMk1QNRV3fQMvcPGiAEW1QBH03Q43V27gegmWIchbFoVcDlUz0TQFXdEYz1tMFfMs\nHpnkzPFpxisW/+Mnf50js1WU4j2OTj/DsYWjeMUB66tXqNUOaDe+yHb/Mm6pifl4xLFzk/RHfapj\nddI0xg8PHUJISb/XI44C4viweSmEeIvN8lDl+LByZug6Qkp6bR+ZqoSeT7tlsdPtc/3aZY4tzXPm\niQu4gcsrF1/ipVcu4oYpj54/i6ZPkGSCz33+9+m095k7MkMcB8xPLZFXJWdOTfH4iTIXTmt8+D2z\nfPKjFwiHbUq5MnGikiFRE+jFQ1baN6lIB2GkjKI9clmdKJUMUp+aalKqOGRZxn77gJxQ6Q0auOnD\nw19aaY37O1/lWusqcQCOXSdIDyjGOaLEJ0gUerFHueAgFQVTN5DhPqkZks9NUCjNY6ZVXu5/BS1L\nuLP1MkF3k1hNMa0lKnEZGQUcnztB7KbveM++e5ITQuP1iwGXNod85eptppOT1IrTRLLGE8dmyUU2\ncZJSGK+z1r1Eq9MjjV3WvStkWg5ZsFjb+z3uNJ6nud1ga2OEIwqsbtxiZnqOLJgh7Vbph288tK5d\nGTEpZqmq7+N7Hvll4iRPeeJxYkPFtIpEUiWMU+J2m26nSRwEaJqOpucQaUa322Rjewuhaji2jmMq\n5KwcmqrRdftEiSRXEFi2ymAUEfuCybFx9sI2waBAEnXZGLwOmssjJz5EEh2jXD5GouZRlJiptbNk\nYsD4xDjeaEgcR+iqgmGYNBoNPN8/nF3RVGzbxrattxj54TDtUBSFJEnIlxwOItDVjOEoRWgmn/nM\n8yyfXGZ3P+TenSuousOnfuJv86GPfj/D3oAbb97gyEKJRx/7AItT86xcv4WQOsePP0acpkxMVFia\nO8nCkTmW544yOV6lVpvCzhUYDQaYImLUP2DH6+EOR5QUm0pax5DbOF6JtBlx5GiesjmHMDVUAabI\nMVOdodXyyPQBneHDuNtRSyfUJAXdRrF99pUmMi2w7naJkGhqCTXUaA5bSGMCx5xFVefwo4xCOcda\n7zKRNmToSTIz4erWn7Hp7hD7JfY6d2hFe/hZjN9LUJJ3lpx415zFDRtM1D6GruqQzdDTFNaa17h8\n51UGA5W23kLLp/SGKd6oiG84EPlMGjMUx2cI3B7YOXLJoyTTTU49fo7pqWc4/+j7GcQJhaJGbfII\njvMwW8f2tke37SNFk+dv/FNif0Qru4Yw8sRhH0dAFAa4+LibN+n0t0llgmmU0Kwcfthia/cOW9tN\nFEVSyBuUSxY528E2bFwv4aAXsdv16QcZOyOPTjdif28dTTiUrFPY8VG2bm4R7W7RvrPB+vBN6nod\npaOwublCfx9kEpK3JGmS0W53abWah+EVD0SOkhSQCCVDKDzo8wgURcPOOUxPVpiqFpEyYNsLKeqS\nxPNA5ljb6KHLNqfPPcFw1Off/t4/YW/1Fkvzs0wtLXH18hXiJODCe3+A/NQMuzvr5PNVOns7LBw7\nhyIkan6MfK7MeH0OS9MpFk3iICKTCbpWYG+zg9/XUKwyB/oqH3j2b7Fv3WLHSxEsEUW76Gad8dxR\nirk8e26bybqCFRZQ9
YfpqzIjo6xUUK0q5eJ70AwNQ6ZUK2OUMou8opBzFIp6jTFTMowO6It1kkhn\nd3uXxHNoNYZsdTfpuBEyMcjncogkwg8CTNVECWz2Rtu4UfiO9+y7pylpzJGqgnrhLNLZQtR91PwO\nxVJCqISobgnLddjcv8/skUdQZEAipsm0GnvDVQwRoYk8abBOIVkmHG4RxCNkqiM6LTrBTW7sfB1D\ne1h9dvl0meJUnvmJZzH6dU7NfoSdRhM7PSSAcAVomUQmksHODlIkZEGIamjUxmYx9Qqd3j73NlZo\ntH1AJWfq1CsO1byBY0LeVMhpgnrBZK5iUyvanHcWmKguE7SvQ+YyKSbYujiicz1k9EeTtL+mMrpa\nwOvm2N/YQdU08pUaCwtljs5Pc+rUMmP1Oo7jYJomjuOQyzmUSxXOn3ucj3/8h/mrP/lJTh+fZqKk\n41g6WZZiaypuZrCJilItoZWK/Mmrt6Ho8MJXvsTpk4/wAz/+c+w0m2zvrRB5Q+YXT3P7xiU6gwbn\nH3mMKAlQhIJpOJSsCmkQoYRtUmEeEm0okCURhVoOqWcMRx0u37+Plgo6+7vkyxF/+sKvk9PzTCwG\nWHqKnkvwYpe5hSf5vqd/joruoRZTeqMheuXhGSQ5LNDqDej6N1nf+SyaTCg4OUhG9JIuqd4iLkZ4\nfsz9zhZmajDYVzFsUIwytawOsUbOTAgin74Y4bZM9qMmo7bH9vYBumqgiyIRfd7J3jVs2H/3Bx/A\nl31GgxGOlkfJeWzvbpO5JY6I41BIqYTHuTN8jqhQZhg2UPolVEMwO1Gjak3T8Ty8sEMgdxivn8dL\nNkkaDk5hATHRQulbJPKA39q7+tbaP1V7LyO3RzCIOPvID7G/cYsoSDk29OgPYiZNGzdI8VIfRxg4\nc8tMH32S8emj5OwShm3Q2ttAypS5+XPMTkwzMVbiW531b4Eev8WGL5FEcUi31+OFF7/M7//OP8EN\ndQxbIUkdOo0WrbaLZWmAThzHaLpKlhzCaBQlY3xsktn5BQq5PLliDlWVZMEId9jjxIknCFOdg4Nd\nvvn1z6MqKnF2iEj2fB9F11BkilWwqVfHyTKXJMwQMuEf/uIneP3N6xi6z9zCozQ7fbZW3qBaX+LI\n8lEuvvw8z154Pyvbq6R+iGHpYNrs3N8gNQU5p8Ls7DwyjIl0QbUyTjxqYtoFvti9hFE2SbJNdDGD\n1NfZHzbwXJsxZ5w4g17YpmBmjDvnGAx2iXUdL9hBNQT/99u0QD8k5yiYefrBgJwTYWRlTOeQfUdR\nLHp+i7o9hudG9NMUJVUp6xNg7qNmOUJXodnvUhorkzegrzTAs0kCh5rQMI0i7YMtSgtnqegl/o//\n5t/8hdiwd1GfZYfhEKqDY6w2twkCk4J9FA2HgXFAagd07HVUS6WmqeS8OWYni0zMtQiaJvvKCnop\nIicEpco4w9EBBClG3kUf3yXqtFHsffY63/GkUELyap3p+WNELZdOcpepwgSN0gi3NcCPJfm8iSZ0\nvDjBu3eLBJe9/Xvcvv0C66vXsAoFVDNHf9hgY3+Di1eu0+4NHpCEZ29BWADSKMP3JOO1On/1h36C\n0oxFkiRsrnfYW9/B91NmZqZBGtSqNZaPz7OwOMejj52hXq9w6vRxxsZruKMGJB7BoEu7ecDa/TWC\nMGKvvUersQJKQhAfQvrTKMb1fYIoQQgV3SrieiloAlVoaLqBapT4jc98geOLY0zOHOf1l79K0bE4\nee4x9hu30byIpx55jOtXvs50vU5/2GFseo6oNwIb8pZNmibYhkWS6OhJhjfsoGgGa+5VjFIb12+x\nP3TZ2HqJJBiSpQVMRWF/eEDf3mSyWkNm46jSY7z2GIgEVeQxlfJDWzbuTJClPkGYQVQmTB3ioEC7\nEyPCMopUGR97hkrtMd73+I9SyKvgNInckDBpUp2vs3xkGpF5KHqI35MYoojBEN9M6ZW
KhCbLkIEkiSi5CkmUgeAhiSK6ZUAKciZjFQqoskmKRxgK\niMLjyUzVyJHEY4JMJvA9skTBdydEfkqvPwExZW//kCAUCGMRQw0xdIswcGjt3sEbndBuNxl2D/Am\nR0iyyIN7d1HiAHfiI9oKWZYiyilhHOHP+iCkDIfHpImGoolYlk0YRyiKTgpEQUi1MU9tfoNvf+33\nmfk9FCkjTmKKxRoZCY4zRohDFFNm++FD0GxmgxFDz0Evn2Xh2kWMSoY0abO4+WGS8SFFs0EmJPiJ\nz1rtSXb7N9j/v6jmnhVr5FWNMX3OlM4gCzO8KCTqTdFFi5JR4LlLn6Q/zAiMCZHuUdDzGPrjkQPC\nEWP5hIq2ytj1GLQOkeMp9x7+gEweohgZLjItZ4zrOZy8E/ClL33p1/5tu/3A2l2Wa8vcu/M2mxtX\nmXgm90dv0n00JrXAjIocPopYXzzH/sM7rKyu4py6SCWf9vSInJdn6dwyB5NTypbJs9VfZWVpm1//\nn/85EyVk/uwiLeeEslpnlvywRmAQ+Twcu9Rsg6JdxbAFVDHj5HCLLDEYD/fxZxK33rvBpQsXWF5a\nZDx1mI67rF58EcNeoNPcQs6JOKNDjMI5Qq+DWlhDzTL8aYik6AjCYzlI1wuR5Mfn6qpApqikXoRs\nW7iphJhGqFoO3zUxVIMk8pEVEyFViDMbOXWQYokw8pATBbPeYDzcZzgYU6nVmA1OiKwyjw56JCnM\nz5c52B+hKiNCb4ozGxHFCotTFzkz+N5r7/Hxz7xK4maIakwWZjSPdrl/85sYeoFnXvgMqXQIwgIZ\nIaQSg8kUSZQxLIPI98kkmWdefJXvffuPefvN6/z8L/wS5dIyCDHDzjbjcZ/Vs9f4zOc+z36zySxy\nsIwCSxcu8tM//knCTOFv/nefZ97u0UpiqtFtTP8ccrGDU/gtnrmq8K1bf/bNTFNiOhkiJSbklkGQ\nsQWdR9ObDDii3wzpRgcMvC0IReaM84S6Ts0qIdef4Puj7yLvV+hX7pMrxqxcXGW+/GFOvSN8aUTV\nvsJm9SqDwV1k4Zh3mPCj8IGRxZ0NQJxwd+8mUXxICCyuXmLqnuL6UDDG7O+00VcF7JlHrMNotsez\nKx8lHTTZOR1Srb3Ce3f+JU99epFb7x5wbfMFDts77HgjCsMKx/4+hq7/0LlJqPPq8x8nuPuAwWjM\nyaMDHHeCUTDY39nBGw4oFAw+9ernmV9Z4pt/8GWefPY5TLNIEMfE7T0EKWDr5n1qS8vsvned9fMX\nyGKPNAnJsghNtYjCABDQbBMZAUXL4QcuSpLi+jGC5yGikwkiomJSMJfxJ4/IF4sEfgxZgoiMgoGb\nPh5BTrOEwcEWEylm48I5To57PHP1Cve3t8iXqtx/7w6i+CLdaRcpVYm8ISQhBx2Fk+MbrF26ghQM\nuH39DpqeoWgRxco6spxhVc9x6eolcvU54tBDFmL6/QG54gq5Qo77d96mYClcv/4ml86d592bb+KG\nKc889SyWVSJIXG6+/q/wI4PK3AL93jFZCifNPf6TX/5buP6Q3//q18iST2Co8A/+zu/xy7/+PNVG\nnrFTJLCOEZgxmmbM5X/YLE9bI4p6jefPfoG7O/+UVjChZm8SJD7eYIQ4q6LoMnPRBurcBudXXuTm\nw3/Nfv8G9/7wX/DJn/pJFuRF7jV/kw8t/ww3Tm5jl1KefHaD02ZM0HIIjD6B7/Ho+P23u3xgbtjC\nesrwNGCWesySNsppkdjOSNQAU6lSzJvEZorkGriRR01bfDyS2u0zbk0JPJ3N2jJbrfe4c3gT3XyG\ntVqJb7x7m9mhg+OGiNUqhqTSLPyZG1bY11BUj2ef+CSXnr5GEvrEgkKSiFi6zdWnr3LlyRcwDIXm\nQZtet4UsWkSJRBKOUGQbo1SikC8yGrWxyzXC4R7jQZfBdIquysiyhCDIZFJGHDgkUgJJhCrKJFmC\npms
omk5KBoKJEEQIkoKWL0IoopEgSBqKauJ6HpokEyY+k/GI/ZZHpXaGi5vrlBsLTAcdetMAS9c5\nPWzSHQ4YTiZUqnNIssF4MKO2tEipWsYZpMSJx72tRzT3DxG1ec5eXkMQy6ycOUe9XOe4+ZC9nbvc\nv3eLe1uHhKHPyso6D26+TrFUJYodOkdHKKrIZDggCoa0mluEsUe3PaZgqVi2QJaCqOTp9dtYmoUo\njjFTh0tXPoYoCKRxwsQNeDT4OtNRynjiU5KWkL0Ce8cyj7I/k8ornujoOZ/JqM8oHpPTUgR5AVGd\noKoVXn7pr3ByepMnll9CUnJ4fofxoEexfp5yucCwPWQsx9x/q01uTmQYHnN6fA/fneJEexSyEoqq\n0Ekc8mWdu187+Yvlhp0EU2jkCEKHaUdDnU6IcqcoqU6xaDL0TlFmKpZRxx+79KL75OICoqRwJJyw\nlDPZOnkNNRVxHZM48flXf3IL8jI5ZZFGpcHuzi6zOfWHzs2bJk+sfYJizqBUrtDcu0cYZxDrOKM2\nWTAgDV3qlTLFIjTq67TGLubMp1jUKZQaZEGCma9TSgIU3SaIXQo5mxiV6bQHkogoWGhyhixBFAVk\nYUSkiMipSioLJEmGJgjESCRBEy+TkTKBNBRI0phMFUjCACUJ6Q67dDtDWm5KKio8e2WNo9YxQRBy\neNpjfe0sO7sPkaUQaTpicfNpgtmE+cU5LLtAEgvIUo7D5tvkLJFKo0rie9SrOjnV4Le/+lU+/PIr\niBk4bsTTz/44mm5x4/q3KeQ12q1jVs5dpVpd4OBol0iM0QIPVRfZ2R/TqCT4wgPCzKfTG3NZydNY\nLLK5eR5NsfjaN77MtYtn0JWEkAAxTbDtPJ98/qf4ozv/iIJqkcgalqTTHA3ImeYPCVCBROKr5Irz\nhHHIYNTlQ+sv0B9JjLwJvdN7nF39GP2kRKd9h9To0go7nJmUidji6JHK+XmX/ILGQXeKFyRE8Ywl\nu87pCCp5i8A4oaJI6Poi8DY/Ch+cG5aCMAwJTQ9RtcivlzAFG30hoT3aI5fVicwEfzJlfukM41GT\nJFPwU4HnF15hp3Mft68wnuZozG/w1r33iGcuyAJifsjOwZT5ywoj54d/q69e+wRXz11kd/sOgpSj\n3thgPB1DEqLrG7Q7J6wKGqpRwZYaLG6UuFpf4t/8i3+GIVcIEpUk8giCGG+W8ODW6zQaFTTZIMlC\nbLtBGqfoeZUsSQhDUFUT0oA0SVAMlak7JvViwjhCCgVIbUxRYOYMcT0fRdcRvRmdVhc/ybh3NGHm\nBwihRK48x9vX38bMa0zGKUvzNR5t32NlbZULG5tousvu3oi5+RqaoiIJOofN20iyzYc+8iG2HuyQ\nL1i0e6c0j9tcvibx8c98ntDXufPeDZ585lkKpQqKYVKrzVGqzDM3v4wzHWJYBdbPv8g//R/+U6qL\n8yyUqiDKmEqB2HcJvRDJkAjTmMbcMr4XMjc/T+pPOeiMufzsT6AKKkgpYegxX5/nTHWDODyHG9/B\n8UIWG2dJp/+WmJFRpFJaYzBrEcoOf/XMF+jZ81w59yu0+w/5o1u/gz04QcnlWFFrGLVnOTcHllhk\n2xHZvLJE0PKYGLfJkFlcvEjFLvGDrW+g2LDPfaSJy86tmA+9UOL9ILxPGeT/UwiCkP2l/+oypycR\nhibTbw6QjJgsFSkv5SGFTErxfZe5yhz+MCYUwTbWONi7i1GZIs/qJOM+YskgFSRUS0WUMpy2g66V\nQesg2jLhVOIHSwf/7kv9e/yFxX9z9u+ydfgm077P+c01FgKTLTNGQ8Qd7iBIkKYVfDkgYoQgZtjW\nAMdT6Jw6XDjzceLIp3tyQCJkDJyAQhU6swmik/Dkxkd46PwRc/IV9GKBf/wrv0+WZf+39q4PbFKy\n1Zsy8ydM/S7VRo7Ujnnxo38JIdUZd8c4h6DnFcJZzGDkMx0OGDU72
CWLqnWBLEmI6yJRT0W0E66c\nv0zqd5GrKZk5Y+RHWCyyMnfpg3rFf48/J9zcfYur5z5ErEr0evf4wfQWZuyw1XlIaiZ00wleInF8\nfB9FWWHmDQmiM9ilHHqq4s2ahM4DPvzSr/Dhpz/PmdoqWZSw0jjDy8//IvujDlX7KTqDU2r5pfe9\nxwfmhhFoxOMMChldcYYxKvHd1/85uXwDUgu9YOK0TkmwUOUcohJjJQ59I6JzdAiZhjHL45Y9pEjh\n7o33GMQpxWpE4kfUi5sks4DDzqPHTTf/Hv+/Rawc8P2bGetnVV7/9ilhvUff65KSks+/hCOMGST3\nOXvhIjllxslJDzWqkaQSU61LtjugVH6OB3e+RbGyxLnzLzFuRihoKMKMZ9afoDW9izn/FLfe+vb7\n3uMDI4s3bJPERaraPKkfEOVmaNkqoppi1yROjptc2rxAOxpiSFNmnQR5zaXsFOgzZeOJCzSH++TD\nAn4nJhQSyrkaBzdaLF80mE576HKeXHWOn01tBtMUsyTSuqP2fD8AACAASURBVD3lL3/8M1Qbc6Re\nSP90ih8M6DRPWFhZopQv8GBrn2dffJXVlRV0w0TWM5JIJI1CVNUkTQOCMCFnaARRTOe0DSLs3HyI\nbohIsg+JTM7QiVNI/CGqIpIkKVbexh2MGbTGRBkMxz2iNKXT67FQXcbO29iCgeMHSChIgsTJ8IRB\nf8T6U08wt2Bx68EWdi7PYk2HdMxvffkHXFgrctAZ0+17jCcxsqJyfsXkxY88z1uvf5dSocJTTz+F\nkbeZjPqUixXu3b7NYc/ll//63+T6zXcYNm+xv3NIY7FBJlp4ozb77QkvX3uefF5FU3Teunmdj73w\nJAE6f/TNb+J54AYqeUPg7MZFVtdtQj9hbvMiT12+iGrYqEaOydjFyudZXz6LJMnIskwQ+GRZxn//\nT36d5vgt+u4RU60Nbo16dZ2MHl7kMmcs0kpOEf01Lj2/RNJ9mpKRcRB3cNMQN2qRE3IkzjuMpHnG\nJxHVhQnDkymIJdbmX+Q0eETn9jb5y2W29iRWahvsen9Ac9ol6Gic+K9hGA2k0p9Du8ufN1YvbNDd\nmzIZ9/DbAfa8gqVLuMcDciWBsqVwfP+E6tkGVbvImG2iNGSloqPkcnS6x6iBhyIHpLUEzZQIBjHz\nZxZI5QDZU1FFl7IJEzfGk2co3jwLaxZJGOCN2jCJGI6mxFkIqkK31SSc5JAQ2L7/Hmsb61iGSb+7\nw2TqUl3YxA9dVEnHnw0xjDpRmHLwaAtdU5lfX2Tv3jZx6FObq9I5aaFLoOVVUlIkU8HzZoSCxMnw\nCE2yCOMUIRFZrM0hywK2YaPJeQqmRiTERGnEmbkSw84xkgT7x12STOHksM83vn6XjfUG1WqBxaU5\n7h5OESSDmd/GljJ0Pc/enTs4oUo48Gl1PPzdLb731i4LlSL5gsRp2+MP/uX/hm1aHDRPqTUW8MYR\nlu2xcf4pVHWbt+49pFpvsFBMsWWRxC7x2nfeIBUM8jmdQs1AlBRk00JIC+jWjJIu0T5uktMMxFyF\nXMFk0D9lffksURgjICLLMmEYUrV1moFKJkMpyjNNFGQnJtRN8plMkojIjkon7hDEMNEeIjll7MzE\nDzqIogxOgNT/KGyEvPhSgchrgNMlHXXol/dZNDbxrmhkhsckVpkEA+ToSWZ9lVhpM1es4p+YlOeu\nAF/+kTb7gZFl1nPQTAFbMxlUBGwxj+eOMSvzTIQBBArGWoiiynSCJlkMkqtx7Lp40oRCcYmpI9EZ\n9chX8hAFNJbO0D/dR9B9fA/GQcykO2aleo5k1iOVTNaWl4mQyWYxSRqhIhOmIWGc4IxcolCgP3VJ\njsc88UyLvGay9eAdVtY+RBD4TNpdps6AfL5Iv9PC9xK6vS7XnnuJ7e1dTvYfoWsyqT/BKOaRZJ1x\nc59UAllVKVYtnGmC44uISoKti
ASpiKUXsfNl6uUGimAQJTGWJTGeDojEhMLCBu++9wOkfES7JdAo\nj6jVNCZTl0Je5bf/5Bbz1SquP2F+vk4pZzHLdMoFm9uv7WPmVQ6O3sDxp1zeXObaU0sgqEy8+xwc\n91AVhzsPh3zxU4v0gwg7Z5O3NK7fPSUVUpbrGnn7DDduNDFv3+OJS09zd+sW7faAwWmXF669zKPt\nbcqlCwQTEVU6QlNFvvJvfsB//Et/ldO9Luef+zSOM0ZRNbIwQtdNNE3jp3/iV/m9//ZfIysSc43L\nVHMZw/4OQlZHDaocz3bBTAgSgbyco0AJV5PIyWPCWQ450JgWuzx6912WrBql2oRZEKKVAhaLS0RZ\nSGf6gNTPkNIIPa1gqCW85B4l43NsN9uISh1N89hufuV9bfYDI0uloTIY2jTmRIzZjPHIRVV03KSL\nKhrUaznS/JD+6QCt4DNfNZh4PmI8IwlUBtNdpImCKRhkaUpJXOS4u4MoyFSiPPk5hYnzWCK7P76P\nktWJo9tMOypPPf8KmedxsLtHKkMWJMRhRn8aMvUSDo5HPHH1Gr1eF1X2mU4yvvet73L5mcs097bY\nu/0emlGitjRPY26ZfL7B0sZlYtnk9a9/DUFQUOWEQilPQoIYCRhKijcdEQug6TbNVpdavkZeNsnl\nZWzTomjl0WwTRbWQhYxMjDG0AulkyszpYhQLjAY7HLUc9lsCuqwjqy4PbrfZXF4nbymUChaqZeEM\nB8ychMW6xXy1TLUs89mPPc+d3X3yos/x4YCAED+Tqdkmo3FAtahz2HZJwoAFq8RJu838SpHdnRPS\nSEIVHdY2F5AKFbYPH+COPDZW5rhwfo4HW/dZXd9gOgUzZ7DdPMSdhvz0f/QJpm4LzS5ysn+TNJ5R\nLi8xc30W5hcxTZNSrkCOAhlDzGIDcTzlyFdxRqcUtQHlRp3JdEwiRAzSFFuKyOtn8KaHqGZCnMyY\nVxYxXxCIx3sE7iq2rpEOEk76W5w79yS9SKDvtCireYR6xKj5iHv7Pa49sc3a0nn227eIjQRjuAls\n/0ib/cDIYhrrXL54jvfu/D62olNbWSCYzvCFEmE8RhFC/EigtiCiij7jiUnZrnPiR3idNkk5Q9JU\nGqJOezxiz91FKmSoaPiRRThxMOMS05mLb8UkkzaZp7FiBuzvbZOGKVHg4LshshgTeROG4wGZoEMm\n0O30qNbK3Nu5TXPrgLMXnmHUG3HwYIdbBy0y95CnU5EkiHnlCz+D63TwxwN+7Bd+ia/843/EbJbg\nekPiFBQpxQ1iMiEj8COkxCFn63gzHy+esHblE6ydP088nYGYoOkyYTDm8PgY1xkjyApe6CGKGfXF\nZSSxQ2LN8eDebbypxLn1SwhKxEmrjW3LbJZUukHAZBZCpPMf/uUPs1hf5Ma738cWTIRU5N3mERcW\n5vjMRy/y2vcfcPlCnrdvztje2+PHX75MQU/xClXKRx2e+ezz+CJo9grnVx1mnss33n3I6mIVTYJp\n75jFRhVV0VE0BT8OmTohhbKF46QUCjqtZhexlGFYOQyzRD5fQBRFPM+DVOJ/+tv/Bz//pc/i3HwN\nRTOpL+aZtrsEXoWJs02Q2WSuiqIFFHLLNMwGB3GPmlbFJEBI+6RBh3rtKfrOPm53RrV6BSsrcjod\noNl56nED1VwgHvZQCx7PnnuZ7nQE/pTq3GX6g4ek5vuXUj4wssyRY3ByhBLlccZj8kvnmaW3wS+R\nJCqyrpJXZCpCgb2phKZCfxCwUMkztiKG/T6+5XFyPMXMS5h2kcT0yXyQPQ/BCKiai5hmyKnroeSq\nJGbI63duIxNzNGpTmNrEYUqURRwPIsxEJRJjMkNkf+8Wb799gaVGDlSF9975HnONOWTb4sziGpJp\nkK/mcNwep9sPKdSqzC2v0rx/l4995sd57et/jK6rpJJKFKeIeCSRT7lqE7ghvusyd3aFn/oPPkc
a\niPRbJ5webqHqNorapHl6iKHlSAUBS1TwgxhT04hkAbVYw5R9Lp87y85BEzOnMh65FKuLlCsxk0Bm\n89KTnFsuEM1ccmqKkVNpdUPSaIhmaKQTh0O1x/ANn9X1BW7f2CJXsOh1Z3zn+j4XL5vcfe89Xnjh\nEucvnmM4GFMs6DxyO2RZjKwYnJ5OGXYnCFaOl68tYpfnefMH30HWqxRzFvghvXGGT0jeLjLw2sxl\nOggCaRo9HgUQJSJBoJwW+Xt/47f4L//Zz2JnCtuPRtg5C3Gm0wxHLNcNcmWL6dBh73iHjuJytrqI\nr6vIqcrJ6Ig55Tz98RZPXv0cbpjRGd/mpfNf5GBwl5ws8HDWxI8dLHmenHSOWB6xGFh4UUDJTSmX\nznP0owd7gQ+wN2ztOQVDFtGkmI35C5xZucKj3SZ50WNpfYlZGJDP5ZBNG3cWs9JYZjaZUioUgBym\nJtEZjHEnAtWCgZ+ITPZdDFUnCVV838WLh+RKdY5be2RRRuyBjETghfiCS+fEA1GlNwnwvZiT0YRi\nRWfUT1C1Au2TFq3jR8SzCCeYIcoK+BpxHLC5UKFgCYio9Lpt0miGKSjUFlaRkhS9PE8kaui6yHDc\nZTqeMRh28EY+BS1l7cwFPvHRZ5j22iiZTH15mUK5gjNxmWQJbpQiqzbD0xGipiBKOqpVobaySprF\n5IpVVMPAmc0QBJPJdMbS4hxRJDLt9Ljz7tt0j7usb6zyjW//CZfOnuH7P7hNfW2ZUXuELwvUyzpC\nPMF1As5u1Nne6/KzP/dXUGUPt9ND0ATyYsTZjSVu3n+Aolu88eY7pKLBfrONbeoYhsZcY471jat8\n/Ru/S14vksQJURTieA77R0coks5c5XGgvnnhCURBxg+miJKOadqossTeyS6CInK3/xpSYNBqD6lr\nBllBpFGvk9MMJEtGza9x5ex5hEzGcRISyUFyMkLBx41l9GKeTqdJv3PIYHxMZ+CSy/LsON/DdwRW\nFs8hJinOVKTtH/PU8kX2wh3udvdRSyXak3s0v+39yN6wD6yC/xv/8G+QJDnicEokmeQMhTDsEerQ\n67s0T464cHmFo6M+jnNEZzRBVDRMUcWJXOTyGKU/R3t6gmUlCMwTRw6z2QxB0UmzkLyuM5hCTs3j\nOm1KtTXcYRtDLxCoHuI2CKJCpzMht2oyr5TxA4WZ61OpLtOoSVgyHLd6zNXyHHdDLp9fI/IdapUK\ntYVVzp69xPXXvkouX8JQylTnG7QHx8TTmE77FFHJM/PGOH5APO2xc/wIIbL43I+9jBgI5Ap5LE0j\nZ1rs7T0i1jQG7S6aaqKWqkShhzfqkqkmH/70x5lMutiVEr1Wi9/+3W9wcnDE/LLJwzvHzJfz6HbM\n9sGMTz57hWbrAbNpysWLm+y0XPJmxrNXLvL6m69x+0GfMIyRZAHHmaGZBa5dnkdV4a07J2Rpyqdf\nPIsfTSlWFynlc3TGQ/qDCY/2B+RUBUnLEGKRXM7gQx9+AUmweXCwhSTLaLJPtTjPN759nfOXLxKH\nDqtrmzx77UXm5urkciXsYgPfGVEs1Hj33e/z0vOf4m/975+iN7nH6UmJkmiT5WfkpTJHpx1kS0WT\nQEstQvGYi6tP8mD7EZuXLhPtT9niPmam4ykzDDsmGhcZdATWF4p4gYRaiSmUHKJpHk22KEo17u3v\nMBmeslJZ51Q9pWBV+L2/8+6PrOB/YG5Y2E2IXYdKvYRuG4iyShKKaGaRdSXk6dIagqFg1Av8cf8m\nRlZGV03Kps5y9RKdwUMCKeVCvcJEtklCAVfIUIjwvISlfJWd/RZyZOBVDxCqJvu3TrF0nwkZ8nzM\nxAN5KmJvGqQnBgvPV0klC0OSGY0dhv2AWBOx8zajccZivUycBYydKRcunsUqFnFmA9bXzzNsHRFO\nT9jr7CNbJs3tU5bXFgn9hCCKSHwHP5WYq53n4pVzZFGMJEn
kNI3scSKVmSAi+BGxVqReLHIympDK\nKa1+wPMvP8fNG++Sy7lc/84W0XjEk1c/Sjlv8Nqb71Gu2CDL7HZmXLxwFqtexHTKwJBed4RNiJQm\nnHT2aXU9pr6PrmikYoxkmNRrFpoiUVuc56+/8DIP723hOi3y9QUGk5TpbMDS2hrffu2bhH5IkBdY\nzxVRrYRISuj0h6jRESftQ5aWKpjFRZaXFpmrPqDb6WEZEaN+D3fmoyhFRDmPiMC3vvabLC6doTfz\nmAUOlaROJiSM4wjRiGl2HAIlh6WVKCwKzJfXuHPjbVKxQhKXUCSB091tfALy1jqj7kMq8+cZjA5Y\nzYckgUEwVqmtl1ksjLjr73D2bIwwzuF3ElQxQWtYRILGnFEhHL5/U8sHF7PU1xBEkSjMiKcJUTCj\nvjBP5IXoZplQlpkOZzy7eIlaYYk3bn2X7eM79E9M0t4hsRexVFtkyoh4AsN2TH7ZJAoVNs+s0+4N\nOHNmk2Gnw3gKohmzdtGi2wkQRQlRiiEK0TZtknbC6eiYBw98NjZW2OmkNOYWWN+wmcwmVKwc/U4T\nZzZCSKBYqNBu9zA1kb3jiEo+T3lumdgLMRQJVc5TN+cIo5DtQRNVlpnL5Rjg4BHhz7r4mDTKBTrD\nEUI4Y5BqiJIMYkLRVtlrn3Du0pPcvvUeTz95gaefe4LZZJVWaxvZnjDyE+79yXf48Ksf587dA3r9\nPkuLOa6u1kjoc7zT5/jYwTJhba3IbrNLlqR0OjMcx0XXdURRxtBs4miCoYqotsWd67eIfQc5HdPx\nIt74znVeuNDgyY98mj/62h+z1igzt1JltV5j6k05PWmhywLbD+5xOHQIKzKeF1PfnzI8PSbN+XR2\nT7n29JO0Rx38wCVMImxJJXAHHB+PidNjzFKRreYWpfxVfvazf5v/7B/+IkJB5enGEuNTibHTwjks\nkzbKRJbANfscB+0tZpMUac5AUHUGzT0S28b1RywV1zgdNdFlyJkTDpttOoUxolJguwVK2kJb3EW0\nAoxApzU8QupHKOb7S058YL1hURwymzj4QUyQxhTqJdqnfYadFtFkSrm4QG11iePuPmdKC3z6yU/w\nkadeYeHiKnbFwyhKhJKOm6rk8zmWzlSJIh8zZyElOdwwoTccEAkp8SSHpIZMOhFxKBA5AWIgoedV\nkknEwJ9iWTJeLD4OflOVfF6l1zploVrBNGzOXbnIykoZzcpx99ZNsmBKvzci8nwUPY8oCliGiWZU\nmXgOuqGjqjIrCw2QJAbujCxLSLSUOJaJMoGpFxKFAoXSEkEaM5lNCRPw/RgrlyeNAs4uLyPKJsc7\nd0iCHqWcTq1ao9+Jac+63L17m0q5RBTNaLdanByPmS9Z3H7UwxdiVpcK9LpdLpw7x73tHrPAJUWi\nmDNYnjN59bkVfu4Lz/PZjz/B1751nb2uz9bWKUphjnfvHFAr5Fk9c5av/eHX8UKRk14PNZoRukN0\nJeF04nFr64QbOy1awwlGGpCIIj0l4L5/wKEQcPXyBpatEqcptlUijlKi2GV/b4+Pferz1BsrVMo1\nyoUyn3/mxxAS+Cd/93Vm3YheJyJyhkiWzcJmlVXjLP6pwI392+CDbC5RV+o8MfcZWt2YYBAgRiqt\nWYeakcPLZgSWQioKBEKGoiWEZCj5IZ6/iV1aoaCUKXk2YmgRTY33tdkPbot+HGPaBnEiIlkWQZBh\nL5QZND26/QGu77N59Sxx7QLH/SMspcjTjXU0U+R05DM2Io7723hhjCSqCKFLzjQ4HPSJ3SMif4BE\nibAbUljQCT0PL4mwJA0/TiETCKUYeZSiKRqRCx/7+ItMuj5rywnLy/M0KhVCZ0aWeciGRqW2QOjc\n4sqT54mlmDBJOH/xHKZoMRj3H9d8sphioYzvzNCNApmoUK+mKKbGwJnROzxgZ7JPo15HkSVERSd0\np2Sqgq4JyBigQc4qMw0
DTCFlEkzZeXRMvdEgb5gIqkl/PODcxhmuv3MfQcqoVKuPkwhTlxu3Bqwv\nzTGZTnn3boe1jTJS5vLSs+cp6ialYoV3bj5i5qW8/XDE2nyMLLSoVAsUcwXWNpdYXTuLXXhAuWIi\nKQZ2scCVpTlu3X7E8upZer02128f0B/1mTgxiZhg6hqyVibsDggzi0TMMGozetMZu80WG2vnyVIf\nTc+hKSLnL71Au71HpbyIblpIYoxSfKzpaCkiH73449zv/YBYz5CEkEnYZDZ+yKViBWnBolS6yNA5\nQRCKvLH7h/zEZ79I2OtzNGrhJR4zO6BSzdFQK8SKSUVZYRjfI0192o7HGcuk0+oR6xLdcMyZ8zad\npPe+NvuB/VkKpRqGrGNZOrqiomoKhmwyf+4sQZIgySInD3ZYqBZQtcfbToIs5Xz+LHJSIp8ILMiX\nKYl1htMJnq/hxQFFsYLvDDD8AoIRoi3qBMGM8NRCcAR6nQCjoBO6GYQxmS2TZjFaOaCiwGzYYnl9\nhdPDU46Pjtk/PmXsOMiqzJ3vfh87t8RRs0eW5tjfaXO0s8eXv/w7tI/7yKZKkiSkmYht2oi6QppB\npVimnLNZKVdYrNZIk8fTmU4YIisa/cgndHySOCMWRAYTh8ls9KcZvZhzFzbw4wwQkTWdh+99nyAO\nOGx1WVxusLmxiGXYuK5Lfa6InyjsNgfMvBTElNbxAM/rs3+4z25zyO2bjxgOR/hjl435iIsXl4js\nOgvVHC9dqqPIIU5nRFGJeersGvV8HtvQKOkShmFw0mwhyglzlRKeLyBKGcvL85QWdJgmtE89TE3H\nKKlERzKIIstLZ4njmHa3Tbu1w3HzPfqdfQ527uMHXcbDY+LIZzIZMZn2GY0H/MJnfpW/9vEv8NGL\nv4xuyuDrvPXuayy9+DTnG2fpDPdQJAPV1vnch1/lu29/jQftO2i5FVZXFdbXbMphA9GyyBVEJpMu\nFWMNUxeIXY07k9dRU4iHE559pcxBe0Tv4fsnvD4wsuRyBaq1eaQUippJvVjEMm00BDafuMrYj0l0\nk9HRY8npXK6IblTxpwGvPvlZzGyF2HepKiWeOfsUsjJFcm0iccKglWJbeUzXZhbOsAsZuh6RCiCZ\nKXECYpZRz9VwOgmFlSpqkCMVE1762FM8ur9Lq7VNlgjMN3KguYyGDmm5RPe0yZPXnmZpc51MlREl\ngY0LBSTbRpRFbFUnI2Q0G6HLCqVqA8+L0BWL2XhKGCcYskz7aBfikESUCdyInigwmkWEaUK90cCZ\nzVBkid5kxO72Fh966RqmnnDjrW8TxgavvPJReicDBv0BWw8O6A8GmKZO67DLYBwx8z3GIwddsyga\nBcb9mGFHZP/gmDAR2Fhf5xd+5tNElHjj7dvIoy5xEPLm/T1ev7FLOa9SX8hx1B3z7t0HKFqe3NwS\nVUPGKMQcNkcomo3nu2TA8UkLJVM47R5jqhYTPyEeZsSewOnhgF5nwJnzGzz7xCc4fnSHu7eOSCWZ\nUm0BQzOwDJ04CJh5LqqSQxJENCPjf/kHf5+bW7/BzzzxCi9e+UXickbzweu80f4Kx/4bnAQ3wTvk\n1vXv8eFrlymvSUTqPdr7Kc2DIaLlEngjHG9MgIDrS/hiiDmr8eql59kdHyFKIt/5ShPbk1hde38x\now+MLO3jQwLfwbTzuG5AFicUFJmcZJFFKaVqAWcwYjSdEU4GCIJI4nmY+RI793a4euECS41LWGqZ\ndv8BhlohIyVLc2i5kJPeIY/u7xMdJESZxdL6BZJYRJJM4iRATDOmsxFpEiIlGXPzFZRkxI039mh3\nB9QaNUp1i2JBIa9YZFmElBRYPnMNz5nyxvdu8uwzFxl7Htu7GQePDun/qdjrbDpDiFNmkc9o1EWR\nVaIkplSf48KZ86wurRM4Izxnhh94iPki+ZyKICVEUcBsMiYnK3RP2zjTMbol0D5+QDCd0
jpu0e6N\nuPG9uzzxzFVUzcDxXAI/RNcFKnbCFz/1ImeWlpibzz1eTihE6PUy65fmeeraeT79yU2evlLhD77z\nLqsXNsgkEamSZ+oFTMdQ0HNs7x/w8Vc+SRym9HtTrp1d59H2Dk+++BRBWqBeX+UH128+FnZNMzYu\nztNuDUEQ8Gce/aMWB9tdhmMHRTbw4wH3T9/iq9/8CueeeZHnX3wOSzdZWFxGFDX6nSbbD18np2oc\n7N/Ddz28wQBtOc+D8Yj/+su/S/vha/wXP/n3GWYDPvvcL1CILnJp+UMMprsEdkZvNGDkuGhqQn0x\nh53puJLLbnOHLDvBMDzGkx0urz0FjQP23vOolyXEVGX5koUrmTiD6fva7AdWlPypL/4Ynj8jSVNk\nWSTwPRRZR8oEhDjFsCu4YQCyj+86VPMyPjKN2hxBNERXdDItYmf3PlEGk7GPnChIYUaMj1kykbMM\nyzaxcyLTwZjGgornZY+3pqcJYt1GS3M43REfeeoi455LccGm34t47ulrDPpD3rl9G12R0RWRB1tH\n1JfLhJHIfLXE9sEBVy4+RxCOOLdR52jvgFKhxv7olOX6As5wRCJq6AUbU7OoW3mUfJ7Npy/y/Mde\n4fzVp2ifHBF5M4LEJYsUssTHUCxSUqa+S3vYYdYbcPnqZbxZwHGvwzNPP0M/mXL10hle++4bKOrj\nhESnM0BRJW4/OGI88QnjmGbbwTRUIldktWHhZik3buzjewEFS6CkZ5TKBt1TH1O3KZQNJEHC1n2G\nwympLzK3Ok+5Uefk0TZBBnP1BnN1m0KjytbuIXoOnEFI7CdYJZPZ1EdQMuY2ighiRqgk5BSLcTvi\n53/25xiftFhYWUZOPLLIwwtjEmRyxSKD8YTG3Dpx4rC0fJlKcY7bW4/oHDc5PBkxHj/ir736n/M/\n/sb/StEe41WbXLCf5Z2b1zmYdlGdjDgpMQl7FJV10jBkOO1Rm2vQO9VJQ4lJMiFKVExZoZzPkyk5\nqJURlYwkTdh/o/8Xa2HFbDJDEUKUvIgfxeiKiB+6IGjERKRuSqVYebzMWnLpncxY3Gywf3xAKT/P\neDxGSS3MqgZTGAx62HkXISxREKvkizLtrogve4xGEcXCAq2TAbUFA0nJ6LZiDEMlEGLkvEAYegxc\niQtxQrd3wne+9w62LVJWFijVllBUiedfqNLrdCjZeSa+S96yefud13j2iau8d+8ms6nORuSiajqS\nLGIUy8RhStEsIkkp29vfp7K0QKk4T6t7gq5Z5PI624cHaEbGaOCiW0XEsIMbp3TGPe69e5NXv/CT\neLOAQmOBn/rC57jx/Rs8c/Eiv/PbX6U7GCNKEsMuZEJCXtFJsoiZ5xOMHqsql02Ve/e3mI1z+Eio\nekbJiOmHOt96/Q6Vcp6SbbN93GV5scBC0eDoxEeSDlncPMdo1CH06txtD3g2b/H6dpPj7v/J3HvG\nbJbe532/0895ztP728v02ZmdmS3kcpe7yyaKNElZVERFsSUIlmVEsWEpsYPEDhwjkQPbcRocJHKg\nAsG2nEgWJaqQFHvZJXeXM1tmp73T3/4+vZ7e82FXRsJoERr8sLq/nIIb59N14b5x/vf/d3XpTjzy\nZQnPEYgjl9TQmc1cCmUFTS7iRT6KrlGsb/BTP/lzLOsqBwc7mMTsPbiCIqgMj25w+tKPoxoyarHJ\nymqR0HdJAp8/+Lf/O5sna3zmI7/InY17PHPxORJRpN6fjwAAIABJREFU4F++8Kv8V7/831GQa3zx\n2m9x3f4GKxsSx8/8DNe+9XVWF1fodgT6+j2K2RkuHlvilnuZar2MWltEEt6GkPs+k6SEoStIfZtM\nsHBD5x01+wNV8AVB2AHmvMXciLIse48gCFXgd4E13uYdZ1k2fXv+3wd+/u35v5Rl2Ve+73vZb/7z\nX0HMBDIBNEElEhNUItZPPMLcDYj9EAGQcjK+ZxHZU
+qlAoJZxXEtdvfvoddLHHUOmYc+M3+CLAuM\n3UPswZheBKakYjkuWaISRR7lhTpHt7ZZWlkhGkVYko/ckFlWahTGKmvHiixvPIskZww6Y4ycTpb6\nRGFAZzikuz2i3CpQa1UoanlcZ0I9XyZfKOHGGYIUc7DXo95co1lqsFYpcTgc0Gy36fW6vHT1W6y3\nK7z3uecwGw1kDA4PdghSjXJZ5Dvfep3RUZdPfvpDCKlOpdZC0BS8+T6el3K0f0ixWkdKY373X/02\ntztj9jojiDMkUSRn6jTLGv2xi6DDs489wqvX7+I7PmGSoSNjlmIUKYeiQqVgcDiKKJsySZqQBAKK\nqb1FwY9goWXgzn1WVxrEYUZzZQEvzLi3e5ebD3voqoAoiRhFE3s+o1Vr4EYemRwhpjGRILO+cZZz\nG2d47uxZWssbDDr7zKYH5PMVssjmja/8MetPf4zN1iJme4WFxZO4tsXR/psYZp3h8JDF5WM0Wm2S\nRETIfHJqgdeuX6G9uMSd3nUu3/gqNx9eJ9fUWWyv4cc2vlclFWwsex8/FtH1kDAKUM0pilBmuR5y\n506V5YUKbmZiphGCGFBXF/knv/gHP1QFPwM+8Dbj+M/Gn0Xl/TNBEP7Lt5//3vdF5S0BXxME4WSW\nZf/PMCSEVERWZaIoJFZVMscikhX29m/SWjhOHImQJiRhiCypHA165HMGJcnGlgJEBSQn4OXb38FP\nUtrNEg/vd2m0GsgLCyi9AeOhT0JMGos0axUGewPEpszMdkkEnyySyayEycjlmR+9RL5SYvveTYxc\nkUqtjSyJjIceqgx5vYBamdLf2yavy7y5d4f1MyfIlQoMR0NUSWTc81FVjd7REa7jc+/WFk88donJ\ncMS1m1cxiyUcHyYji/5wwuaxVZzpgGI1j+6rPPfYCV5Ox/TmEboU0ntzi8D16I36bJ66QBjavPKN\n6/zx179N4KtcOLfCzIrpHXUQRIEgdCkoNQQho6yqHNx/wMVjK9x4cMgTZ5cx2mVu3djh6HDKYslg\n5gqUchKqoqJrOWIvwskiREkir0KtUmRuz/E8n+t3OvzMxYt0hkfsdGYYOYksESm2ymSBh2hKHBwO\nUXICuqGTBDI5Q2Z18QwffvISgmSQpCGikkORcgh6jslgTNx6hMff96MgyGSBj+vbZCJIegklV6NS\njYiCgPl0hGoW8K0h9w5fp1xoY5p16vl13vPYJzh97nle6v4vGNEqhjynM7+LKJqQKyDbM0Yjj2Cg\nsrS6RGbUuP5gl1Z+EcHWSQ2X4dijUs8z1Lbf0QT/Ptuw73faDxWVJ2QyuUKZ7v4+DbPAVNTRFJc0\nSJnNhqhKmSiNSKMEydDIN5YZ9w7R9E00PY+pyBx1t1ldWuFo2qUzmhCLAkdHA+p1kzgJyJkGpmJg\nBQKDzhDZCGnnz9Gd3EXWCqRuynTocPzsGqvtRbZ2HxJ5PqPhnFp7E12FTmcbIc1jRR3ycoPyZp6Z\n5/LsJz6Df3SLoqYjFassL51APC0w8QK++Me/B2qRydxh7rlkkY8T+1y7fJtWPceot82Js+e5efUa\neqHCc6tn0CpVBttdKgsnSGOLgqJhtE7R7z8k7wXoqkLPdlk6fpLu732RZk3nta1tYs8nFTNUSSGJ\nIw76XUg1pnaKtFhmcOs2QSxx82DMo0WV2WBOTpcx0KiUTfw0RZIUJDWBJMUbzVEQeezCCfRSnSef\nepo7N2/zeM7E86Z85TuXybIQVStQbBjMZxZCpCBFKoLkoegGYjHF1A0K5QI/8cGPk4RjNEXBzJco\nFutIGyfY2bvHY898krMX56i6zr3X3sCoSWwWL1FuL/LwwQ2qtTx3Ht5GM3Isrx5jsbGGoSlMxwc4\nkUM5TFhd2qBoNVDQePHGb3MzuIwYxpCMMMoNjqk5Jrk8KgpTx+fefZv6WkQlqxDU75CEBcKZQaja\nDJ0eJafxjgb4Q
f+GZby1QrwqCMLfePvdDxWVp2gFNFWj0qzjhTK1egkvyOM7Mqnv49pTRDElCGLS\nJERIUw4HAUkE+byC7QVEsUMY2mSCTCpEICkUmxXGaYwbq0CKki8hVCTMpkKQ5ZgIE9KcR3VDQarH\nNDfKmKKAk8ywwphnn32Sc6dbnF0pc+X111nYXCOnJ0ixznTSwfZSfD/k9Ze/wngwQMrnqS+2yOd1\nJtM+K80GP/8f/TVqUkick+kcDvC8kG98+wrjkcP1O33udAPOP/okH/xLP80Hf+RjTPtHpIHL4mKL\nJx5/hmZ9mVmoMj3cpVIsU24usXVvm1evvAHJBE3SIIPZyEIxDArFIqKUoKjyW4RLWSYJRRKthl4o\n0ViqEFkemevwgfdfZGGxjqjI2FmM67msNAukjoyi65SqNRxfIIsDNEnm5o1r3H3wkI3N43z78hvM\nxw56uULzWAVZlyjUc3jZHCl0uHhxEaNgUNcXqCh5/pv/+L+nUtRYXlqhXWsiIeBaA1RNgQw0VWQ2\n6TPuDjh+8RyXLn2M1uIxyvkaM9vla1/5Gpdfu8f9mze5dvMao2GHJJMpFRsYacx4ekjk+jTyJaJ4\nzNn1HwfPwwq7fOz5X+L8yac5OhxwZM3IpBbnz5/n2FN5FlcWaJ84gWeFCLGAgEx/5FLMTuKL3vdL\n9d+NH3RleSbLso4gCA3gq4Ig3P5/OSnLMkEQ/r2i8j77xS9QzBs4gUM9V+CZ595Ho1LCdiBXrONa\nfex5gGbksIYWuVKOYqvEYLBLsX4BQ9RBVJg5Nq6fkKYx5XqV8aSDFMqIkYAb24R+yMwNyOsC1aaB\noukIWZPugzFCLBGWp9zzQyqvi0yEHv/Hr7+Gnitz7JGP8qEP/CiXr76AF2XkSiVypoyoKzy4Nef8\n44+xuFohDV3K5VWCMOTsmfNkgoynaDz25PMcfutL5PMKnj/HrNRQpICT56ucW38S1x5jFotcf+kK\nJ594BFlqkAged29tsVivYrbajDOP3bv3aK0s8cjZ81QrNaaTKX/zr/0V7NGEoTNCNUNu3zji2pZH\nEHsoko5uKGi6hlEucHSvi2rJZFnIi9+x8L2HNFZKPPrIMkKc0NxYoL20zK2dK0iew+Zamx959lH2\nH3Z4sH2H/sRjbW2Vl25d5T3nz2LrPqoeM+gdQc2gZixx7rl1JCdi696bmNoSdtrjb3/mH5BPE0q1\nBrE3RlVzbyUVTy0C+4jNlVWi0GZh8xzu8CYyZURJRZIzsgw+9em/Su/okCj+GL/xa/+c0/UaSexi\nzTWUYhPBCxGThO0HdznzyBM87OyzdeNz6JoFus8rO7/LqcUPkRgOp0vv4c7e17g9qlHQarhuxsj6\nDvOOidjcZ3dLZ9IN6GQv0Vx8Z8jeD2SWLMs6b18HgiB8jre2VT9UVN5/+OOfpFbO43kxu8Mx/syi\nsFREE1WyOKLePEF/sst4f4/C0iqhZ6PLeUaTESvemDgNCGIfOYJ8IqHUN7n/4DZhJCEFFqQVRFMk\nwWOhUMCOXVKgs9NF1mNaZ9rM9kfIsoRYTTmIHTY2z/H4+eOkCVy9+gp5Q8ceBvQGXabWjIIos3bi\nNGvrTXrdHXJKxvkPXcRUy2SZh2vPUEoNothHIMaslZBzBfY7ByxUK9gEqPoaZ8+dwEHl1je/TH1h\nmWtXrpCrH1FRSxQrYJZLTPsdark2wWqGnqui5QsstmFp8RTj/k2OogmNjZPsDx9w+tF12s0ir291\n8e05XgQ5HRQ5xndD1KKMbtZw0zErj20y2r/J9y7b6KbOyTUPJZfjk88/ym/9zhcZzqbc3OmxuFwn\nDAwqRQk/8uiNxnwtCzEbebzRHMnI0V5s07vzEM8xOffkGaqugZD6/Bef+cfIYkaczrBHPmkWk6Uu\ncyug2lhH12VEQSCIVObDHZY2nyCTVOLMw7Nd0pyALIm0W8uoqsLf+/v/iMlkTre
3Rd5cJksm7O3c\nRTfq1BZX8a0Ow+2rrOee5mHcJ/bfxOm57Ha/RBQ1uT3aQsrVKOt5JHmEGGxSrh4jVjQUucqzP3qc\nyAnZHd3BUE5w9Y8Pvl+uwA+wDRMEIScIQuHtexP4KHCdtyLxfu7tad8flffTgiCogiBs8A5ReUkQ\nkkkSGREKMhM7QdZkMjnB80FSI4RYprZ2nEl/h9BN0HMysgp7u0eARBQE6MUCuXqOKPXQSjq1cglB\nK2HqKUooIgt5KqGMLmpYgxChKJI6EVZniqQJeFFI756LpAjsbO/wf335d3n59lXevPMKS2tlNF3C\ntRySOCbXrHOw/4DtvTGnz1zi5Omz3H3zCm7ooogSiq6haQb55gKjucVKoUmShex0DuhPxmwuLvD8\nxU1Gu29SVUSyLCGe7VISYxZKBt3uHYazEW9+77vYUxetWuLg8Ag/CiiVFzDyZQxDRpFKVNptFAkW\n6suUGg02Nts8/76zPHbpNMfO1mhuFpHyGraXIsgiYTpn4ewpBgeHRGGZue/TH9p4acC93R3+53/x\nh2SKhlkss9Iqs7dvUa8WKVdavNHbBVUhh487mVNrVVhZruMfDWmt1jj3+KP0DjvUa6scr5+j191D\njS2i0EdURYqFEknicNS/TxCMsa2ALFOJkpBiocxs3qF/sM3R/n1s16N78ADbHgMxoiAiihKyqKAq\nBWQlI/YDFpY2mDtzbm19j6vXrnBl+kU+/uxfwfNu4w9SvB7MnIj9gz0yb0S1tMHGwgVq+kVEs4Bh\n5DmxUUM3BR6OrjNNtjhzJsff+dFfeEcv/CArSwv4nCAIfzb/32RZ9hVBEF7lh4jKC32HKKqQhDGF\nvM5wGDA87FFq1AnmPuOhRb5UxfcsSvUWzmiEHCuQSIx7A/R2iak/ZawI1ESJ/cEYTTZJdQEtLxPZ\nIaGiE7kZnXKE4EM7l2OScxCp4LtzivISWTZnbk3ZG3TJejquGbK1d5swjfjNf/vrfPiJ9+Onx3j9\nyi18a8Z4ZNNaaXL39i2kzGM2PKR97FFyRg01ZyLJOkngo5s6+aSMZR/RsUY896Gnyew5t+9cxpFM\novg1ps6ENIbtmztMv/gin/jUj7Cztcf5i09hDQbs7u/TWllFVSVm3VtIeo1edxetpLNUf4z7e9uc\nO3saNReSkw2+9Cd/RCFfJLk/QKrUCMWEk0+sMT+wqTYXkWsxagcsP0KRdFIh4taWDbGLrEhomkIc\nSRyOQ546f4x9r4eja6w063T9OUEaUK2WGfkD3H7E+y4dpzNL2O3fwnJE8mlG3cjTLhS4u3WdxaVF\nysUKqZBh23PqpTJxomGWSljWEG8yQM/nUXMN8nqKUSrTaLa5d+tVYn9Cub0Ob5+zGw8nvPrK16jX\niiSxzWAwRFFN7h7cwZJm6MIlvvDNf80sOyQIiuRzb50nbLTz1GsC+fwRR8MjlnJnGFj3ycnHif0x\nJW2Bil5FlQ+4/71d+mvvHDnx/2uWLMu2gYt/zvsfKiovSRPiKMV2bFwBmssrjLsDcqUM5JQwFslr\nKZOBhyBpJEKOmT1FkiS8yGGlcYZMyki9GTuZSF40sDKP+U7A8UvrTCZD8FKsXIIgSQQ4iKJEQaoS\nSj5SwyBnyMwfeNRONAiiGWIpZrHVJEx97r8+5lAP+cKrL/Dkxik+/PGn2X64zXKpBamPM5vy0vde\nYbndxAtCBFEhiSLspIui5hBVBUV6C5r99MWnQDhgEIV8+NSHuLf/GuPxlEwBxSihHq/ysZUTxJ5F\nSdLRoj7XH9zm/R/8GDdvXaXvzilVS9SX8kiiTBwJZJrD0mKVF1/4PE+95yPcG90gTsbUqlWspTXc\neUZaddhsLXHXvYdUHpLXSsRqAb0h4rkOSmLQXjKxrYxCtYAUa6hljcj1ePXhfTw1YdzdppSW0Rd0\noknEzB2jxwpyyeDGvS6ylJHKEYov8aFLP8m
jCxvkSjlWtUtE/pj7918nFXRUrUxOE3F9C/telzCI\nmPZ3UQt1rOmrbF54kgsb5xAFBVU1SDKbwPeAMbpWQMDixz71s/zqb/wDmgunOBgfIGg5EjVhmAyp\nlh2caZFc7jEeeeQJjNUDiPbZ35lSWr5PXtVoNmoYyYjR7RFzzcEXZ6y12vRHCc9c+jGq1ibOg9vv\nqNl3j0gZRPiuQ5JlkCgYukSYFwl8myAKKIki83FG6EfkS3kEOUHwElJVJs4kHGuIFbkImkgiCjhu\nhG5oFB6pctTtQeqhZSqCoiDGPpmlIDd1uv0ux9dPMpHvMJ4MkEMTYTokkOsE0zn9u9sc/+BJimfH\nlNQ8g52Qr3zjNp/88RbPPPUMN27eIEolDgZDaqUie3uHtFstNNMk8UJ0VSbKJGoLDVzHQRFUYISb\nxpxceYx9q8/64hJz28fa3Wd0FLD2WJsXL3+X5fYiadbm8IVXWd08Rb+3j+/MmXfv4oTHqa7IKGqe\ne93r9LZ2aGmbPOzepH3Q5uuvfg5TN4ms+6xoTeKcQqKFxKnN8hMtRDvg6JUOrfUaF44f5/aDOwgO\nrJzfoORLPLT3KOgKW7fuc5BFFEsmWgmUqIKWJdSqKq5apDPsIRkGWjKjvFBEzGQGAxdBkHnhxW9j\nvl9Bvz+ludxE0vLs7Yw5deo4h4NdTNmgodaIQ5tBf8Ty4gpLm08iiQ63rvwpw8VjFIs1ZE1HlZu4\n04c4ahlVnSLLMtuH93n2gz/LC1/7HQ7SMUUx49rWmxQ3ZTLJZlqSKXkKRSXP3hsy4/KcwdYBK5PT\nPDS3Ob7WZSzvU9VXGYVz3J6JubDJpy9+gqVGg6wsM+j331Gz75pZoiQgDCziMMYNA0QpQRANrKmD\nLEuQT5mPbSTdRFBTTNWgP3cwNAndrOAFEZpSplhaYGZtoQtglMpMp12URhEe+FRLRaZ7A9bPrhMt\nuUz6PkWjyiQYkYY1vGBAYbWF3S2gFiWKzSK1RxYIlYwTC2e4fe0mxZNFRs6cz3/2TY6f7HHu8Uu8\n8p3XSMQCK2ef5omLj9JcWiH1XCRZQ9ZNpEzFmri4dhcvDAiVPsFIZCDdxp+H5CrHkBhR09aQlw9o\nOybaxY/gpxbrBZk4XiRfLuBYh5RqZXzpNEkQsXPnCp39A8xNmV3V5s3BV9DKItZ8zJE146nWadLY\n50C1yIZTgnxKuSgSizl2r+xhxRlrawq/9+Vv84lPPsdk54i9oz3iLCVWZIZZzEHi0a4tsXx+md79\nQ0RlxLyvoJR8wjRjoV5CiGMG84SipTGcjUjVDK2kEscyH//wR/n93/9tKlKdrdde5N7uPked+5w6\n8zheHKJpCfNhiOt5CGqN/sFdXn7xy4SRD8bnWWivUq0uMxzvEsQJ9UaJXK6ILIucPn2RyXzKpV/6\nx+wN55hqyG+9/L9y4/CzeJFM013BClO8+TZ5VaF3bURBamHZfZ6qvY/Ld15nc13j5NlNvvD5WxTa\nBlceHHHBnHDj7pAfee5ZyvW/gM1fUeyRJTVSISEIXFIhJZz7NJp1nNhCHAYUCxUEEtx5SGiFmMUC\njjPAsVxqK2XIyYiORVXLISznScK3WMKCauM0CszSCYWnQkQtxup62M4MqSghhinzeUJ7pU7qadTz\nBjfv3CP/+GmixCTRbPphn9FOyL/4b38NfsZC8kOCOEWSND763scpFMvEgUsqSkSTGYIqQSIQhz6q\noZFmPm/ub6ErMqPZjHJtiUm0TzMQUZaOYfUgUI/ISRqYb/XI73UGnF14H2KuQKtYI2w1SSKf88UG\nWSoxcWw6mcW3Xv0KJDmSkkx2FPLl3a+CJjEbh4yqA9bma7xeehkzzGPlm2jRlPrzx1B7PboTkfOP\nraCbUyIzh58ITPwJWhTQ308onVwknkNn2COpzqgVF6kVxqi5Bv60S8cN+OijH+XW/p+S+D7H10xe\nfWOOUjy
ipit89Uv/J6GUoMoRlUYbreuCkafbmXDsZJN71y4zsWIarRYQUTvxGM4Xfp+5M+CbL4Vc\nOOYiqdusHztBqVwmi0Mce0g+XyFJE0pmkSSNaVUk0rjEpy/8NPd7r2G5IyaxDW7ANekFFJYIZBOp\n5uLPROx5TDYxGRY75FZE6ssKUtjmyZMf5D1PfoDBnVt0D/Z5uH3tHTX77pklTHBCiwiRxI8RsgjJ\nVBEMifDIZ0qKmjPR8xKzwyn5YpU4mmEoJaSGxqBzROzDMB/S3e+T1zOEioQQSkSOzNywiQKPcmCS\nFQUiTWDz0iaT7pggCxCmLq6lUat7zLKQymoR1w8w6yrTgxElbYGXP/857NkeTqaCoZBHIQojMiHB\nGo2IySiZNQRVZa97wNrycRzHIcHHjSALE2QjYZaFDPZvk9ZzrNaWef3aNyim6xyKPfpjid3eFNUw\nOX/mJO58RCr0ILJJ4hAhsJjPuuhmBdPQeP+5J4mlMd+6+l3ycZHtvQExIgubJncnd9BnCd8TDxAM\nBV0pMux3aOo55oN9ZFUmk8eo1SW+/dp9agtV5LyC6Rjk0yqiOufYiWPcP7qOJEg4vkWtfJ6BlxL6\nfSIDjlfavNr5GitqhevWgHrhSZbO3iUvNRgOXV4dfo/pgcaxWotX3niNYqVBfzplHPfpTYZkcUC7\nVuVbX3uZk2cfhRf/CDeFQFfJJxmvXvkup46f5p4/IW+U0HImm2fOQZIiSKDrBUgcAi9BzRUoFFf5\n68/9T/ydf/Rpjl3wsCcVNDNH0RT4yx/+Bcx8C2e+w7du/wHjESi5R7hh3yVvNDi7+AQfePRxsjDC\nTWWmg312e9o7avZdM4uYCWRpjGfHOKFHXtGISXACF4EUP46IQhfJF8gyCc3MEXseWQrlQpHu3KFs\nlBh4HqZpIoQWgS8R2BFO30cjJVfTcHcihDRkOpwTjAJ0SUes1JGjfUaDAXq+TaO1hhwf0B/N8JY0\n1Ejhn/7yP+Rg/zpSmmDUl0kcl1SAQmWBSecIHyiVanhBROYmRKFInImUC0XG1pRIznBdA0faQtcz\nthOL4x0VS5Ww7gRMitdgqcLGegujNyZLIx5c38LQSuQbBlQl4jShmm+QJRF+NEeSinR725TkGpVi\njd5khlSu8tjj57hz9wpx6uL7IpKekUkZdm5KMV/hKH2r+7AqNzlZr3F1eoixohCnFhWxiV6Imff6\nhLbH9777Eu2VChtnT3PU6zDZ3cadexQMHdeeI2kehYUz9Jw9Ti+9B88fUdDhcLePHMjsbGU8dmYT\nK5FZXFzj6PAhkiTQmUdI0ZTHn3mSu9sPGU9T3vijr4OasNYq4HgJm63j1DYqHE1mGM6MSHhIvbiB\nlqshSSGyLGMaBmE4othYw41jBKPEUiHH//YrX+fX/uTvsna+xK07NvFKQD4T8MZD3NmMa4M75BcV\nqq0nOZzM8Sch5953niAZEDsidjAmE2Munqq/o2bfNbMkGcRhwNSyif2IyNBI5gKpkhGmHkkUE8Yx\ngh8TZDGyEqOnBsPJDNXME0YJvemIRIupFBc4HO6jBR6BnyA4IgtPLzPr7THPZYj+mEJS5OSFRaax\nQ9aLKD+yie9E9IZ7yI6IdaDw3vdvIsdV/vNf+ZvceP06nfEcIZYxKzaPnL+EkDioikKp2aSYJPSH\n+8hphpqLMMyUhzuvc+78B1DDjPHwAbO4T88ZMLMTNEWnWl/gzu4NPD1BLahYTg+/P6C5WEHJQp49\n8xG2Hjxgikxv6y5pGDCYHbCwtk5NaDLD4pknPkBrqczU6vPV0bdZWBfY2XsNRUlZO7FK92CEWlIo\nKxHjUMabWei+Rmn9Eu7gIXdGc4K5Q9yO8I0cJS9ArgiU6ovIhQlLWkSnJ/Lg8n1U3UVQl0GO8DOX\n5dVFnEgkdafElolZryHJDqJXY6YbdIdTcBPsJZ/XH2wxPDjgYDzFSCRmq
ctCqY5t9RjObLYe7iKq\nEqIbY1VKHI6nrG1GHA3mnGisUSjVmIZjYlEkERWmoxFxEiDFAb6Y0gh1iuUEzQ+4O9lGjGV+5pm/\ny7V7LzJpPSSZ7nB163VyepNyroa9ZRDWIWr0Ob/2l7jq/Gv2776B0c5TWWiQSSoLS5u8cf3Nd9Ts\nu0h3sQlin8DziLOEILJBTPGCGN9zmVsuWRoShwJxkDLt26RxwmxmE8cx7eYmai3HNJsiqAnVSp7Q\n0Sg2Tc5+8DwHg7skhQQtERnvOyRCyM0bb2J3XQ6GfWYTC7MmIcQikRwitAS2Hoz5Gz/289y9cYtr\nb24xndg4jsf+7g4vfvPbmMUWyduHp1NJpFxsYdYaiLkag/EuxVyTr37ld7Dmu9x48BLtSh1BMwgn\nNrEd8YZ3G78iY+s+SRYyn/mMJI83pkfcPDjiq1dfYJC6WNaUrWCHI8/C2DxF/2EHS52xsbjKoDdi\n++5l5pOIoraBjE4wDsgSEeegy2algTov4U4UarJBvVkhLPbobT1gPg1Ze2QBpSUhaTGGDa40I50X\nUIWIjcYSo8BBk1KkSo6LJ9/L2UUTa2qjmiXqukEzENk7GPDXP/rz/MRjn8KfpNw77DOdTlGLBZS6\nwHdvvcr+3ha1Ew3m8xGuMqV0PMee3ePqwV26wy5BLOG4c+w45cHuIc89/wwHRw5GIY8T+hwOXkAL\nbLqjIZZnk9PzmIpOpClv/aZPIsLAY6v7BjeuvEHfTegND3j/Ex/lb330F5jZLbYme+z191g7ucHT\nH3w/J5YuceUP7/DiN/8Ntidwb36IN+hx7daXiJOMh70OJ0+cfEfNvmudks8/+SiJIDPsD0CVscdz\noixFUXNE9pg0k1CUDEWSmc1nGJqGCEwdC1WQ0aoqveE9xpHLaGghyioCMmu1RzgY3GdNW8IfS0Sm\nQr5ivlWjkQVa7RZuGlEpLeAGNpKQYncd5CB9nISSAAAgAElEQVRB0lWWsjx/9PkvcfTgDapLZ7Bj\nKJbyjMcdRr0Bq2srvPbyt5BlFUXTSFOJOB7zsL+DJusMJ2N+9Qv/FG845UA5Yni3j1IycPenRH5C\nppsIYorvg1QuImYVorlDJqqMPZtEi3lwuEW7tspecMSDuwckesKwOyOXyHipz2Dc5dWDN6m2NvEj\nH6mRkG+1KTRzOBOXsxunWD1xnMPxdXQkcmadQPKoNRbY2x+x2FJw/QzbS1jQC4ztbRZX3sv25HuY\nioxSrOL0u+y6bzJ3E+q1JeLMJRYT7h/22NDOcP7YCpXWEqfXn+A73/0OzjggclyMcouP/9xzvPKN\nWxw+7LBxeoOFC+vghmhKwnh/ymTmUyga5DaLmEWTyprBtddvU6qoTJwB5BK8Ysx0GqNkMcVSiVK9\nQXfUpb2wQrHSII0ddqyb3Lt8SH0TcppGfWGNVbOCIMF/8PzHeeLkBT731c+CO+Xr33iRXEumuW4i\n5VTWy+eYTxPOrD+HuPIenr/wNMvNFfScyW/8xr/8czsl3zWzfODpM3R6M+wgpFAwSMKITAQFkUyI\nmY1mlCol8kWJ0dhDNiBIE5IoQ9be4o5FeokH+/fQJxnNQolADqlWm5SVBKGaQxnPiRNQFIm8IBBI\nETOvT41llhSB7niEF0wJfYn2sRqLfptb1x4wtAOGk5Q337jMoHODVnuRemud+WzMZNDjc3/yWYql\nGtVyjd5gn95gyP3DOxSEEr/3ym9TpMCcOUf9MYKksGg2OdqesfTUEoai0t0eYM0sQt8mEmzknIoo\ngZzXmEwHNM0FFEUhNUPWT5zAOZzgpC6v3R2z3d9jnOuRmytkbg9XkdDFIkUtYz4ZsLHxI3RnV4g9\ngdBTGaYdmgUFUVGRxRFCDNWySaBGFGshiSZyvPkk2/uvs9Q6xeHhEZISk1tSUBMVX/FwsrtkNMmZ\nJl404D/58V/gmQ//FJHd472X3kuj0
uL+7CGSkBFKLj4OxtihdKzEuUffz2T6kL5vo4kVktQjv1Ak\np+TIqSn1c6sUWglCYJBoAr4bMvH3ufD8++hdvUlpJY/tBAiejevOGfoWV998gc2VBXbHHrvWy3z7\n5deQ9JiiAN3BDmN7RLNUQNXLlAoav//iKyyvbrJo1Pn4pb9M1NU5kTvGX/34T2IoPucX18lIyRk6\nX/vTf8WXvvq9v1is4//6P/00tx/MkESFSs3EntoUKiZZIiDLGd48ZmWjTbOSY/9gSr5oosgJfqRQ\nKuvkSiZjMq4fvYFPiN07Qiktsrt9m4986gO88sJLFOQckSSTJiMmDCmqdSS5hJFv4HsB9mRGQoQY\nZJhlCN+UiLMYJZUIErBGMxIpRhUM1jeaSIisn3yUuw9u8ZM/8ZMMh0OSxOfq0VeBIkWhxhuHW7jW\nhJxeQliwUEON3gOH2mKJw7tdcoqOWDTxhjMKbRVjs0Q5K9N3+gizALlmUDMMRpMxUSixUW+Rq9RA\n1Wkvtjk63OGw1+XejftcOrbO2LKorC9gVlUm0z7lVEROSkQFEcsbIWs+unQMKbAQ6hH9joMuqGSq\nQDyxEfSMABXmFmalhOPPkFFZaDTpOz1kz2T1+Gm627cgqTDtdyhHBZZWqvztX/wfaFTaXLv8JWJV\n5He+9Fl6Qh8lNlkumaAo9D0b767NWO9SzldYWTzNzRtXECKFxJD5xEc+w+99+ze5dOkUOzd9jKZE\nTp1x71ZIy5Q4tdDEephSblc5tXSaL9/4Du89fpzNWoMvHP0hC+r7uWFdxvRrXGyeImlLZIHLEwsX\n6dsJdgDrayuUqyb1fJVUzREc7XPz1mUWTzxGMJ2Syho5XSbWwJ5M+Kmf+Ft/bqfku2aWf/if/TTX\nH/ZRgWajDpKMHVjoah45shmPHVZWG2wst9jr2GiKTqEi0B+7VAsmhUYBG48bk5fZcW6R+GVy+gqZ\nbhH7EtJYxjcjxrctjJxKVonZNJtYRZcF8zSDyS6apuH6IfOhhWRKZEzpfV2g3iySOgKpmjDpzomU\niFzBpL3WwMwUFlZWyFwXxTS4M7zBqVMX6IQP2bs1QFdVRhOX+nrM9ms2jdM1RnctzLZCdD/CjeHs\ncwtM7RmWPaVxeoF0luDMEnKlGE3R6Q5szCxFFE268xkVSaV97hijvW3KtQJu1MMay6y3TjK3jyi1\ny4yTPtEk4cTKeYbDXUJRRhRiegdHKLqJUc2o5zbwnXtYsxzlmoahNShVTuHM3kAwc0ytA4pGAWs8\nQjaqDIZ96rUF5n6HkqlTz51ha/pdBt9J0DBY0hf44NMn2Hd9UknBjSfEORFL0inkDjB6TR5OHlA2\nl7jzcIv2mZO0VBln4jCKbCwn4sce/wD5+hIvPniBRjVjd/cQd1qgVU4p13Qq1UcZ37/D5G7A8+99\nihdm38XNHBpIGLUNgtDBkW8id6r0pX0a2Tp6c5WapLCxehp/N0ZdktDkCoZkcnT/dTZOvAdxFrCw\nXOP1l19hlsK505usbJ5jqsh86qmP/8UCg9uxRJJkRGFIFEYo+RRBEEiSgCQIEQWB0AuJUo+hM6Va\nKSPME+IgQiwUUTNQ5DxyWMDvlNlcXsURfTKrQmSHjGd7lMw2xYZKZa2CN5lz/fCQ9ajE3O4RaiLZ\nbEoQ+mhmnvl8RL6ZIJ+UmXVGpIGOGmtUVsoodZHhboc4KeKLKbfuXWYSQeJ5nHzPJQ5nO+zvHWLk\n8/hejFqaM9iFY++t0Lk6pdZo40/3yZ9pUBYEYlEiFWPa5TZCmGHbHvm6QeLLzKwpZiYhlAqoccj7\njm2w3e9j5ANkVWYezmkUiljBhI59l0zzUUIFLTKI5Iib2y+zduIMipOhqTqnN55ByyJu3L/GfLjD\n8eWnmBc77Pb3qUh5+ne/TWbOcPcyFpsmR84Qrx+iNA6oVY+RSg62aBMnKWE0oKw0+OV/8otYocfc\n9
SiXy6xIBl4Q48QWN7deotAXiYMqE4YouSKqBvXjDbRcyOGhy/GNBVryCbpWl93gDtLhHuIs4Gg8\nxJk5BEmEVnof86jD3ksv4lsyVV3kJfHLaMsyJ8xLzOcuE6+Hr2xzYfPDvHL0Kkq+SaN06i34+9xm\nZ/s+kZYh2kX2Jy+Q+Bb17DTF/pDxcJtXj3a5e8fjwvtOcHfykGjPw82K76jZd68oGSQoCIiahOXM\nSMYRRn2BhBjfs5BkmRQJAUisGL1hECd9iEUEElLRwM98tFTh2NJJXDlEycpocoi+VMVLE6LUI18v\nYh3NMWoVamaBwAA37mHtzDh26jST/Q6abyHoIIg6y5LKMOeTqCLu1KJybIXpfpd8u4ATT+j0QzRV\npXCyhHVNoN2uMJjNKVfyJLFGIO2hx2UaJ1Y52L9NaXUZu3tAWowY7w1onVng6KCHIrr4io82LqGs\nDEm0HJGXIlRMjLBM4AikUkjeqGFKAftXZ6ycLDMRJeww4GSxTGFBQQgKTNyESrXIpHOPsLbG4bV9\nqs089doyl698gVCKKMkV5mLEK2++wuqFOoVmgcnkAbK0iOfKqJbJ1rCLaebI1AAlStHFGVYgoWUL\npK5GrdpAUFT+x1//ZyyXauhakU53zCNPHsezHchDuVznq19/jWMn28yyMUqi4SohyTxhfWkBb9OH\nZIRoLFAVazj2DqPJmPZCDdeKWKts8GB7n3nvJma1jF8TkU0XaQ1y5pzAbTIbTdEqVRJhQppV6Ixn\nnD/9OGv6GvPgHl+8dpVjK2cQYpvtw22OPXIcMZBIbJWj3i7bt64i6G3aywKXnvwQ4ajDXAm4zh2M\nYOMdNfuumSVIIwzTxJ2OSBBxgojYmVIq5AjDFEPKkJM5E1uBgkCSzcl8/y2Wbhhi5vPsjfZo5ptE\n3iG1UpX7swPCvIY92qGopQhyjorYJt6QGE3us3lqjdt3LyOmRVJDojOaEQcesa9iFHTsvYB8LmSh\nvsL0/iHN8006nX3EEBRZY+JF5BcMBB8EWaZ5WiHGI/Z9yq1l+rN9cmYVIy3T6dymmK8i5H1kC1LB\nRJVVgqGFkRfQNZkkElhaL5KkeQ46B+iiQZJkeMmMXKlEsBcwnk9QSNBNi3lPo7QgMxr32POKrJRb\nRE5GmHQZ7t9GEgScaYhRFlBzAnvdHrKSokYQKyn1VomoFLGQX2GSWGR+QGabPHKyzTfvvU79UZOk\nn+HbCo6Qsic4eKnN6fWPMLYe4MwP8VOPWnMVRQ9wZYnWpTXu3dphLk4paTnm7QFnL1QRdAlnnIfS\nENwGQjFk23qNUA2ppUvo1uvMLYFJOKKuXcAOO6iCQqLbXHziNNsHY7Swz0ojxtTPII/HTKQCia1i\nMeXu0S3WGxeohdCd3WYvtXl5J2Ulv0Cr1mamvYklnCVQPLyOjxBFLFVPsWPvcenEeUbCnIKyQijt\nY+s+R+E2BaeFOf//tF79u/Gu1VkmnSGaqaHmIBJDTAmsvkXsOSALyHJCFqVY4znMfVItT6aISNFb\nhy4VIUNUdDRRwYkn1M0Wx/InyDkyaqFG65FTLC4uM5UOcId7qFmVWe8+YQheFpJMQtLpnErDJN8s\nkssXMFpFSmvLDIIDjLU8Xm+IrBtoyxXCXPBWKKqX4fkuywuLzHop05GDlwbYkwHIBYZ3fKKpRaVe\nI7Ut7KmDmqtQXlhk6byBVBcR5uD6KYnmI5kCPeuApx8/hparoihFHCemphvUNtZwFYP9oxEtrYLb\nH7L92gFpLiYauxw93EEtZjhpiqzlEZISmpJD0TfxFI/Ih0AwCQ2J47UTyLaPM0npWw6ibdAfOzjO\nfa586yqGUSAYhsSOSCor5CSdbGZgeDp3XnuRcDpk5g4IHI3TjTMMBhmiH6KNYpaqMhpFzjy6iTd1\n8PMethOTEZBo4CU6OnnEnEDUqULS5OGhg2HmOb/4HKYaYaYuk5l
IvzcmtDJU2cYI2/ScGQ/3dum7\nDoPDCUpzgR/7zM/ynoXzjKfb7AnbVIwStfyjrLUXOBhHeL5NPdpkuVAgVzC4NTxie7hNaqesljS+\nd3SHnK+TWj3M5gZSRUVSy9y9cu3/Zu69Ym1J0/O8p3JcOe6c98m5T8cz3T0905zhDKlhkEzRkiwJ\nEm1Dhi8E24ANmzZvdKEbGoIASgBlE7ZICqLHFOUZ9WRO6Jz79Mn77JzW3iuHWqtylS96ZMs026I1\nsIffXSVUXbwvvvr///u/h7766Zr9qWUWz4+oiiLHzgQxlvA1AzsDvhOjyxK6riJaEsPYJV8sEHUH\npJGDbJjIpCBG5ASDsZ6SNFweHD9GEHTcKEVOHRw34vHbjynOTdEfj0m9Nr7rM44d5qs1kqLMSWMb\n0S2yNjvP7bsfsLg4z/bmBlq2RLvbxZNdRM+kUi6jTU/hFseQxoi6TtA9pnDFwozG9BURwYsQZRMt\n9nE8ASHWyJfrWKlC9/CAQBURwwg/cIknAmmcUizUCOM+ai3h9Qf7+KHPrDVL8XyWcBjTPTqgnFtn\nJm+S5ANQ86T9E55b/4u8aX+fp7VFPm7cQVYzWFqVRuMITYGo2KWQlvAYcvaGTOs0ZOS6nIohzjHk\nJId9d5t4MCGRakyd19m818KMTZBU/L7DtVsv8sEffQNj1kKZGAz2x0zPTCPIPbQZjeDjPpX5Vfph\nl9XKOhP1Qzr7Eq2+x2JWZaIcoocqYz9mEvfIFXySYYFCJsXQEkxbQZAluqcbDIpD9ERGkRzKJYPj\n9BA56LPd8tHLNey8T+BERMOAg+8+4h+90ySYOkVqwsKlPMWKyOTQJymtsJYmZM2QVNURBIXP2td4\nkx9QzJ5B9GSO+8eYeZOe0aY8LvDwwz/mYJSQy4UUL8xieJ9uiZ9aZglEgWDsE7kpcZgynjiMRw6y\nnnDSGxFEYzRLZdLzEVKZKOrBj1duNV3FNDP4YYygaiyUV5lEEj1nm9Z4HyWXY3DqU5mt4qcxiqLg\n9HxkYuanztB0Ryiqh26VKJZrBHGCUAgIfZ9CxaYsq8zPasw/scJn/72beOMej+7cZbh3TKQFaKqP\nMw5ZXKyxd3QMhwPEis5od4uFm+d45uwt1L6AreqM+yfMnFvj3LUVTCnPucpFKrUSZ25ewvJkAjmg\nJj+FYFWpzc7hBAOIygiKzvoTZzkZPWLq8ixpFGDhsVBa5+OH3+OlS89y/3CTvhrQ2+rjxyGWWYIk\nQZJDDt+5z+xahscfH9FtThimO4iTGDOjc3pwjGYICFEWrzlif8cBUaXkS7zwlS/w1//Of0YhVVFz\ni6hyhfXZOv5AIYhgKAW8e+9bWDM+D3uPYKxwOGxiaHOIus7zK1exkgpOQ8XIZ8lHKgIe7BXxm1my\nms727j3S0GRr/wGNYZeLgzp6PCGXsRk4EhIGVnmGhUvrSEGClM/R2PeJ632mbiaUzwYUg3mKMybj\nkcJbX9ug0QvYvPOAQfAGu3uHPG4fIvdbKGfWcQ2RbHCOnvZ91LyHUojJZqdZv/YiR3qLqTMlJF0i\ndV0Ou5NP1exPzSxiqpMaKb7voxgKQpoSRtBpNjF0FUVWSUIfNWOSJAqTQUwq+qTOkDQFQYipFLPk\nczVswSYYtXHGHrZSYfveIbEbY2kyeUNGFnUKUypHxwLjcIA5CpC0DPlyHkHp0nNPKVcuYWZzlOuz\n+EZIbJtEUcSP/vBfQOaU688+gz1bI5caNMMjJLlIKZ2nvFYlimScO6eYCxJx4PHevXeJiwI98QBj\nykTLicSyyVBosB83qV0uUZUkVteX6TpjxsEBljtCDXyK9RxResDi2jl6Q5nl6jMYikzlzBUyZYOx\n3yFQBX7w/lt4+ZQ8JtWlJVK7ycDbQrSLeL7M4q1b5KwCK7NfRFOy2OIZ5I6OUYuQSylG3ac6XyWe\ntsnJNk9eWaD8xBOUlRn2tu5
ye+/7IA3IyDJH7UOsqka7d0ywb5Iv2piZMzgHMsWqztbhY/J1CyXa\n4G76gIyZQSvFuFFE21TQ9Br6vMvopM3RyCU7VSGatIlll7E0ZMQYvyfiTXSmsueYDEes1m9y2tgg\n1CYMDpuce2KOnLhKM/Z48PgRSrbJaOgw6ghkZ1bp+nvMLcxytK8z6oyZL0zRbYX86KN/xkp+nUfx\ne0ioWJZNPIhp9vd59eErLFp1dMHDCR2UwGR5ZfrTNfv/oz/+L6GKY2REZElBSFIUMUWXVeJYxHF8\nJKDf7xInCpIcYWk2UihimAa6LOJOQsbjMXoiIoce07ML1CsLCCZkMwXOXFxn6Pqkikm2ZCDnVdZu\nGETxgOJsDhkZgZj9fRdNTTk+3eK4ecjhzglh6NHqHJJVTKqZKiJZJEIKhQKnwZjMWMWPxuiZkOdX\nX2J59RLWdBFJteltO5RmcvgTFz35pE3qqNXm8R+8g+RZ2EpEu7PFg6P3edC+TVbNcXqvS6Y0iyWa\npGqEasXsHL1KzuoxOnnIceuAxxvvImoK2ZpHRo64tHYNKZYo5hd4Zm6RmeIqucWI689fZX7GRlTG\n3L3zEFVyMUIDvZSDaoihm5iagbPnI6saX7j1y8g4bD44wbYkPt58m3bvAZKus/ODBo3DHs2Oy+r1\ndRZXroE8on86YefBPSQjodMK0JQsO4192opLARNR95hEMWYiYCYuerZD6IRkzsi4Q5fIayJkYnJR\nltwoT1tMUeUSqj9g8/R90sjnw8NvUajnUTQVQVNo9HuksY8W51hfuU7OmmNmdgXZKFIpzWIr06Se\nSsbUuH7xKSxLYGS0yCtlLueeYDafJ68tsdtxyNoKhpHQE07pjU+JJIucbFOZq7B1+unbin9qZhEU\nHd8foRVs4mhCmCoodoykqnihRxi4jFwIRskn24bDNqjKJ4N8EZxgQOD62IaIrmaIxwHjsYOlZ3Dc\nNvc/vkugCYTukJJZZbq2SKIWqE8tE9oCSTgm9NtYiUirOUF1JbLaDP5kSBi5+CcqgmWRPbPEhXPX\ncKM+XXYhGSFmSqxfPM/O5gGpO6QyYyGaMVY/pFS2yKsGM8uLJK5Hq+Hg+x7Vq3n0qkqnM2DYjnDd\nLHrOQJVzrF16EjMZESh9ynqZKC1gLm9gzJa4cf4Gi/kVztZWUMM8lcIUjqbx0Vs/opKrkFE7fNS8\nhxfcY6F0kZ173yXVI6bsGeyyRKPRQMzWmHQPqNs5srpOrl4gY+XIWrC988f8xV/9dQJVouftMXD2\nmUQ+tmZz669dRHcSrr9wjSRw6Y1PmEQaw3EDLc4RezEnp/uI6GjKIp6j4SQ+B902tZKBryqMJiKN\n7gCzFpFDQUCGaJrYzzI1vcK5a2ew9TKHExdfNTH1HNOFZTRhkWDSpKAaGJFNMgbLLlHVLDx3yF7z\nHgVrgXx5hJSLiRKNaqHEfK3CwBqjEmFbGVQjyzuD9xhKHsOgSamsockFxqM+QpJixFnMsYNZyJBK\nAZeW/hzyWYoFjW5nRNFWiDBISVAkDdIQRVBQsyrhJCKIfMI0YOIkkIAiKUiagm0USYSIbq+FJ7uk\nyIiCTRAl1FdXKczWCCYWrd6Q094J46MxF0rrpMMRvt9EijzQ8gRpQJKkGMV5GkfHKJJBNlfFWl6k\n/WiXpchn39tFECdYk4BMUcC3PDrNO0xVFjk8dpDjBDlWUVam8LVTummMoA2wFYPZokEoJHhSQG9z\nhK4HWDmd4mxK2HNJfWj3HjIRhsQ9gaPTDproYHVfxNILGLrIkXuHY2+Htvsxo9inlrEoTeVpDrr0\nfBXVrmIYWWIk1EKV4Waf+7t3KKs53LTNejFDYWpM3phlOjeHmtgszK1xctLDkUX++bd+k8tPztNs\nHtIfj8lYBmq2iJlNmPvcAq3mEAoao5Mu5cUqSjaLMBGwC2US1ebi+vPM2DNMW+dgmMOnz8lJh
OSL\nzJRneOHCS5x0ZfzAxYgtIjxW1pawslXeePNNtrqPmS7rFGYy5Es6URoy8I/xE5MgnyIIIUJgEPcc\nqrkZpuwFol6Wne2PGI7v4EbbVHOQljaR7TH12gIHJ8dE6Smt8QZOT4KRTugKpF0RKU6RrFXms3O4\nYotYj5CjhMid0I+TT9XsT80stqbjRyH9VoBtB8hGgSgKUU2FNIgRUgE/DtBkkXgQYRd1DF1FSGMk\nUUVODOqlaXrdHnK2ykLBJg4cFLdHmhxxuPeAQl7ElC0kOSZX0oiUHhgpZmGesRMjDSNSX0ZKTAw9\nhzWnYNYNkBVEehhlg41gSDoaQiLg5mRsW2TZmmJAh4O999kU77Lb2CSRXYRUZGCkSFqP0IMkHTPx\nBmQTH8NQUO0BRdFGYkgYStSWXJqjbYziiGbDR7ZMpqcK9E8iJu4GrZNNvrH5VeRggbEfUVQi2kc9\n1MIYR+ow6PbwxCE97wHNzZT26TGFSZXySoU49fE8i/W1GT7cv8+omccL7tIYH2KKDluP71OvzuGP\nDdAEbv/RA3LFAqoc0xuPEIYe3shk80eHBOMxjfvHXH3qBi/d+gU0LSG/ajN2XbSciJQmaBmNTL2C\nKCeMRgk3L3wWMzuNrNhsHu1zoX6D/MJl1p+5Si1XoD8cM1Fdzjx1kbn8PGGa0t7xcfohebOKEYUk\n+ojTnRa6UeHs6hnMxZRWt4nt9ZgzMlTqFRYyn6PsaRTslMzpGrFXRkw8MI7xhRRHDondIWVTQJAr\nTGWmmWgRehKwv91gMpjDa3fpTxyOvUcomT+HA3xv6CBJGQIhxh8amJHD9OoKE1dCzZt4JMipSJLG\nCMSMGgPSJCFjFzBUmTSaYCkaiedxfHpEZxCyXJ7FEWKStMT8wjlKQh7iFBGR7f0Tglhl4omYcUgs\nJMzkiihZicGkwai/gxgruM0x8UGf+bk1xr7CYNClnr/E2HGxzAyuL+KmEmIa0ZUSQqnHyaBHtjzH\ncGePtcwVnFRFCQMkKUupOM1EivDlkPUzFvNPljHDAnaph20WsIsJZj4lr1dZLWepST2ytoG2eEJz\nvM966Wf5wgtfYjkzz6Mdj6LpsPH6Ljld/4R01iqipxJOGGFPFRmEEUqaJa/VeO6pKzzePmZ2oYgn\nPcbPW6RhyEQYsXLhBlJ2TKmoYeZPEddijo9P8RORhdnPoKguuQWJ9VuLXH76HHZF4I9f+T5f/+1/\ngBQm7H90ABjIgcm3XvtXfPDq63z7n/+vIEXkxCwfPPohkR9x2mziSTKHo3tksgr37twmm1WIXB9x\nbKHkPDzdJZJtrl3/Eu7kgMe7e4RJRF6fY33tRbzohI3eR/RbVY5jncxTz3HpF3+VQ2ePqZqOUFQ4\nHh7y5vBNUtVBjo8QBzWW3FmmFI35C1W8jkHjjxt4bp/+XhNVknh25XmWrst4WpVU66EEIuao9qma\n/amV6L/8hecRlAonR4fYtkogJciRRtZOQMqR02JSUUdCQNV0/HGfmfkpRESKpQJhIhBOJtRm6jzY\nO6GczdN39xmKIcP9HU5PmzSGA3RNZeym5MSYw8EuaS+ksSuTzUlMJiFarYAk+kiRQz6sUD6zSqez\ng9cUyc6JdCcPGZ9M8CwXGRs/GlMpawzdJlPZEm7ik82OUWIZ1dBxugH1TI4wlzB2fIx+jma/zdJC\njX4UMOpoGHFM2NOozdWJpU3K+g1C5Zijdg9JylEqXcTsTpGMEnqNbRrOBnutFjIpkp1luhaTSyuE\npsAkPMQbhiiaTokKjuDQ6fXpdx+wd3yAmHo0uiNkf4wiK9hmncPWMak6JuQUN+0zjPrIHljSLMvz\nVd7/9hvMry8SjwKGEYxdFx1IU5EbL36ZfrjJ+SsvQbBDOLTRLYn+TgslkxIqHuOejKaLdManLFWm\n2PjDE4SpECXqEKAQ9j0iQ8IoCpTSaXqDPuPJLqPOHnFsI
Ks+g8MAQ5pQU2R2ex2uLr/AfBlm555g\n7Dms1i8hOxt8tL1FYsb42hHXZi8giSU2v7tBrjxNRx7Rdly2D9u4yZDy+gA1PI9tzSDYm5xOPErL\nMdO5y8SDAVouYpS2uP+dP5389VPLLMcPNzClBvNzWVZWZ8nnZ3B9iZ7TI2o3yBWKlKtTqFaNbHXm\nk7l37ZOFppiUVAZJAtdxWFlYxlASQpvMp4kAACAASURBVEdiyVxFMUssLa0iE6OZFtWyjhQGeMmE\n2voa0ysGYk5EKmh0TvaIpJhEFRByEoPmY7R6Hsfc57i5RdArovVVdDmHSkihaDLutEn1EseTIZIk\n4sUK7eiA5rhHUu8xVDsoWoBZzFFeiVDyZbKzD5ibWebi6k1GGYH6WY3XfrDBTHiDNzduk8gh2UJK\nsxnS7NzmwdFDPv64ja5W6fYgioaYqY4/njBsxXR6Tdxxn0rJZnZ6FTlTZD9+QBAeUSsoZEvTjNo+\nIzViaUEl0ipEls3KQpXZCxMSb0C5cgUkA6k7zUBQqRRUAgFqVy2EVKE7iNl85Q7dgy7RKOH6zS8y\ncfcQRIt4PELVqkxwGRw5uIpLoim4jSyh7OMGAZYssd9oUrhYZrQHo1AlmzNQl6foxds4J/uMXZ+c\nb2HEOaxZC0Eb0z9xOXt9jcjR2Ils6rUKj07f4d6jbQwnT/f+e3z7j/5b3t/aRrcyFLQsfmuZaDTP\no3sfoE1nGDgBnckRV9avcGP9RRamzrMw9WXkgks2OMJcPMWYPqBzT+Zh4w3MrII3EbHVtU/V7J+V\n/JUH/glwgU864v9N4DE/AfnrV7/yLEHocnIaktNSRFkna4lMzc2SqhrFUoXmfh/VDpienWfv3j0u\nX18jnITU6zl8KYuYprQnPRRJ5tRv0/UajIYOp8MjPLnAWOqjy1kkb0CzPyY185iigiS5VOwCR8Mm\nARFRf0ypPIdhKIwDjyTsYFmLlGo5ukcjxu4xupmhnmQ4GO2SXTSZNF3KWo2B0aQoWAzVEWIyixSk\nVKcqBCcQ5QL2Nw9II49IDLi59Blio8fHt+9jZeBkMkT2i5yZrWNNSzT3VYb+Nt2tCeWrCba4zlM3\nnuRr/8vX6XljFhan6MeHCK5CfsEl7ZUJ+zqy6BPqAWlq0Ro6lBYUZC+mcyqQW7DJdvqEtspEdIkd\nj6xexvUcEjHGa+ko2RZiycJ3e7x06b/B0vLc3XqLg0eP6LYcLjwd0xmCpJYIWn0coYPfyKEVJwyP\nZKbOWky2UwJ9RDhyMGvzqNIJglZnemkWNVbIG9Mcbm7Q7e2Sq5fxkwnDHYFMrY29MMXkcEhSy5AM\nekxVV3DcIa46xBuqzOSLCInFTvcuWW2JUNgmlzEpmVdhdIhQO4/f2CFQTskrWQIhRVZj/FFKO+4h\nDC2M/A57jsit2pMUZ1Z476PvMkgHXKmvEyc+jxsS1bqCHwr8wX/+6p9aov9nzSz/AHglTdNzwGXg\nIf8n+Wsd+N6Pj/kT5K8vAr8lCML/7T2qkOCHCoatEUsx7bbD3mnIzvEp+3e36R3cJxL7mJZFnEoM\nhw6D3gDPc0iiFFFK8Tp94jRFt22c2OFxv0ugGrihQHdwTCbNELsB/aFIkIqoUsjI6WOFCr1xB0NT\n8AaTT5pkOLuMfQU9B7MzNwi8CduH77PfeoyVzdHttTiI7hPlQlJPRJ6OGEkN9LhOL3RZ0K+ylp2m\nopdJ+gL7fpOTzTaLl+fJlAUso8Ltd77DYWNEebbAmdmrJI6MqEtsbQx457vHaGZI6maxCjkK2k3q\n6hzvP/wh1qzKSy+/AGZKvTCFVFJp3FboBzGx1WYgehQqJUTBZ/3MInVfQvdrmEUFU7HQsgWy1Srl\nWp1MsYIsqIxbYAYVBl5EElU5+aGIFV4jQ0qlUODW+ZvcfO7zPPl0gWFfY732AjNGjVJuCk2X0MyQ\neJQlP2cy3jJQlnUKF
zrc+rlfJJ8fglegaFfx0zrNnV0++O7XePnZFwg8l/7glJxwgbQ8oj+eY+d2\nC13J4nR7FDM1TgdHSG7ETf1JyhkJs35KTj5F1mTKRgfHC2h2R6ThMUf+MU7nAQ23Tds1mURQMeoc\njjaRtIS12QqGkcPMvEzZn6Ll+2ztfpdbU0+yWnuOVtjkyG1i5044M30eS4w/PWn82zKLIAg54MM0\nTZf/xPmHwAtpmp4KglAHfpCm6dkfZ5UkTdO//+P7vgn8Rpqmb/0bz6a//PIN9EIW1wuYjAc0Ox6W\nYVPJpAhRSm0uT68nUClXqdbn2b7/JlcuTxH6GsurS8SKwmQSYugiEPCD5i6J7DLpB+yMu0yXMzS6\nTSQi3MgjmkxItBzyUYS1qhD3PcI0gayELeZwgoQg8Ihdn2zNYNQJkZKUWGqBmEOORIScR+B6FGtV\nJN1G9lvIcpX1uVWOxy0UPyHKRmiBycS9hzsoIQkK2rRBfODTGnX4yy9/hY/3j3jjX76Kj0BhWUbL\nCqR9jcQUqM1EiKNzbD56gOA6xIqFNjfms+f/fd549E1yRZFJ7wTLXGfsRUxV5njw+HW03JiimSNs\n6pxbyLCXZBBHJzBTQnU1StmztMbbLFZW2Bo9prs3JBi3kEKX/MoSzb0+mUSiP/FRJjpRASxZA8ng\nzPIZ7t17nUJxhfdef4vP/MpVbr/SYP7sMiebO8xet/GiJkfHHdbWS6jOBSRC2pGKGo1wTo6YTGKa\nj8ZY1Qyy1SNjFcgVc/z8V36F3/sX/4jp+TqdpsN8dZ5gcopuFwm1DhP5hHFXwB2aOFqHXJTHVHQc\nJ0QriwwaHey8hqxn8OizYpWZTHI4/h6ZrI0pTSHrEwJvQCl7hl5wQnuSkk4U8kYPW6ogFvO47T5B\n2uNwb8z3f2vn3zmzLAEtQRB+RxCEDwRB+O0foyd+MvKXqtBrDFEEFVMzCYIJhYyJ74ekaQRpShqD\nIiZE4ikJKYpkoJs5It9DFkS6g2MEX6Q3chm2jykoWWKpS74q0Bv0SVyFQd/B7ylM+ir+cR9tXiZ0\nYkzZIF/JUhINlEhCzYm4bofFhYuIwwQ166PmcoQJhL5AZExIRQVVrRIejqjIMY12H1eQeXywwZnZ\ns8RxzOX8U6hKETm+QD7JYs5vED0OcDSYXh/y7sYJm+1vU5mawR+nPHe1z5R+AyNjUyjMsJS7RUVt\n8/T1ZS5cWsDMZrAUnTPLs0SJx84bDdonInff3EYLAwbtU7Sxjckq/Z5IUp/w/uiAnJXQE0U27z1C\nT8u444c8eO99lhaf5vLSczzz7HOs31giyuTZ/ugx1UoNv6uQaUm4ahetOKGyLHHx6gpGXccq1zi6\ne4RdLKCmNmaxSGVe58ZLF0iElHCkUJ6CorZGqAzpnhxwZu48mYzEZBQzGftMSEhzMfnlJaZvzFFb\nPse7t99jcf4cy6svM1+5zvbkMR1TIJo74WHzhKhfRtaKDCddFKFALVugH4WYRgHUlMXaLQy5iOsM\nqQrL9MURuhaSLa4TEdAa77K90UTUDY6TdzlttymreabNWcZjHSc4QtYUjFKN0PYxzJ+skFIGrgO/\nlabpdWDMj3+5/nX8GCnx/4r8lUQB3SBFM8cIKRiJSn84xNBkhCRCIsELPRJSBu0WCAJpIhD7PVJE\nxDhCMLPIYkAoSkgZi0mQ0PQHNO7uI4vrzK9OMTNbRS/Z5PIFZs6vQCoyPHZJNIHt3TZSyUKwbEqJ\nypWrLzCYBCh2gUw0h6q7KIbBqBlR0mfwhgEoLvnFCjuPmuhxiuFOuHnhc1R0menSHO/f+zbv/Oh7\nWNlLbEUPOHgY0WptI9sJufIM7d4HiIHFl188w+y1s5xOVO4/eI8L5XMUNAu37TBKAt56fIchAdcu\naQSBw9f/t3/K4ZshorpApryGWVHoNz0SrQUFkLyY+kydnDJLljXMQpl6qrMyu0YjusM
IhWu3Frl3\n8C3Chsve1neQhyFFocqlazNUpyOEmRPEJ3xufu7LfO6J/4C5yrOEoYolqfzSz/0VpJk+sxer+ILH\nF375l7i/+SpS1kAT5uimO1jqWQ4PHvLg0SMse5btze/y3Gd/jpVb0/zKf/Fr/Ne/+d/xc3/1y1y9\nViVjhSxXl3Aab9G+v8vBd/9nWsOP0CWdjNLj3ncOuTF9hTSUqISgiyZGYLDdCyhny6hin1Ab4Dq7\n7Gwd4WslDprHCH7CsNtiND5m43jA8ajHajVDyjTlTJGV+YiMpfGo8T1ct4GXijinLVo7Lbyhj20H\n/49G+LfFIXCYpum7Pz7+KvBfASc/Cfnr/tYRHSdi1FOpVTPIGeMTOrEooEkKkqwgI0HiEU1AlS3C\nOMX1PSaBi5XLUbR12l5A1rCZt0s0xx1GD/r83d/4H9Bjjw93H9I63UNLdyksGdy918ARE9RQQ7QT\nqq7C4FEb8j7dSOdmbZFGsIeZr/Dw7cfcfPks4Sgge7mAlBgUtBKiG9DcdvCsDEpOp9HfRzM6lPR5\nMk89RW1hGX/8j7n80gb6myXM9EV+4HwPz5/QafSpziyys9NFqs7xl56vcZiMsJ8xEZKI4X5CR94j\n1sZcmc5x6o344D7U5s9xuL9H/YyEYI2Yqs1zdr5OT9lmIfs0588u88rH/ww8i8bOY0IhpT5XZ3/Q\nQU1kMpGJOOOSz67TO+hyGh0y8XVqM0uULtpE8QEbJ7eZrp/n/vcavPP7f8ALf/VJlit5bqx9jtcf\nfZ1ee5fy9BTCRCOvV9lqf5f6whTuSELPnWLvX+Fv/cKv8Y+//fe4fOlFFHFEQbrE11/5J0xPX6HV\n2OTt039IrRLTDTRK+gw7t18jMiyEFZWRIjM/ZfN4u4XunsXKbXFnawe/NcGtZ3D8gKV52DsZcNjt\nkNUyRIc6URKztrqCuzfAXswx6D8kmEwTBRHFNEuj3eLYPKB91OQzU5/h2L2HVXyT0tw1CpJDmBww\neL/N/dYxk4lLLrv8J6X6f8SfdTbsR8DfTtN0QxCE3wDMH1/qpGn69wVB+C+BfJqm/xrt/ft8gtKb\nAb4LrP6bQCNBENK/8XPX+GjX4dxCDtmwuPfolErZxsDBkGTmlus8eNhmeXGKAB850VmctggSkWJB\np1CdRS5YNLtdSprB0bjPx5P3yJpXObdwETWTQdMSXvnhaxgW/Oh736A0Z6Ej4/kxRsWAboBYVNAk\nm1ajTWGuzGC/y9KZFfDHhGlEvbzGcBigBzGnkz4fvv0B1UUbrznCUzW+9NRVPti6y3/4t/8OaZRQ\nq83zP/3g17gy+ytEQYa1xc/QONjg+CQkICVXMNnb/h6VispJq0dGL5BL8zzsbrM4naHppMwVy6ih\nxm7rNiVtllB16Pj79CdDzkxd5eN772DULiDrPoJbYCZXxx2PeHz0Q/yBRr6sc3buLKfBNmMxYLYy\nxfH2Jq0DkenFHKXyIo8ODjCEMYY+R5p4zC/aTAID56RP0h6TLIH3cZdTJghJglmwmMmcwckMcIZD\nvnTmGT7ab4JxjKWcY/fRAyTNgmwPxffJz42Rwy/gOjHm9B1GvQHtsEnSFtGMEi8/cQMhmXDUW+TO\nnR8wWzPY31AZKy3yJYvdowamJRHGGkHLo2jqpJpOJExQ1AluX2P+7AzOqItzGJDJ5OiNA2K5hSxa\nOCMPzVZYsOfI5Bu0hjnyhZBSeRnBVgilt2jeFiksXkCLu2wHLZatFyhWbP7eV377J2pY8Z8CvycI\nggps8cnUscRPQP5CNfCDFkKSRZFjFCHCjSJUMUWQP/kyN0hojdrInkx5ySKNYiRZQgxThsMhzc4R\nSslgJrWZRBHjsc7nnn8R3x2C7+L6IpmszmjY4/pnbrG/dQ9VEBk4YxQlxc6VQO/T6exy9uot2s0G\nl1bLdEanhJpGRi1i6hbf+t3XuPH5y2Qsm6dfuko
2W6TtHDHpbPD+/RPioM69N95l7emncNwWl8xf\nRnJOcdJFcqrC3HqN3/nhf89Ht1sodZt8IeVkeIqZtVGGZ/jrP/NlGqOrnKQ6rX/1VT66+x6urfDM\n01eRTkY8bByRSCk5o0a/K2LnqujCCYvZJ2llEtp7Dxkd7SIHEgur6/SPR2wf3UcvC8xlCzRbPr5s\nUKiXSAKDyC1zbqVOt/9VJAoUS3VksUAavEu+ViIoiGTss4yflahHKuONd/GKIkN/xIx9hpHg8NrR\nbVaUFTx7nrffeJWbN26w5X2Ejk8mqnHijTHGH5NTazRbKWO/S9ibpR8dUdM93rzTxNJnWavLaCh4\n4zpWfg9Fm6LXOsDUNNKehJxJkWoJra2EQBhRrZkcHE0ozUoIYYgXxGCLaMhoGZ/2oYZdlBCxiL2Q\ngdKiUn6aG6sKrdEYQ5lFjLdRzFu49S7OaYdTK0JwFrGmFNxD/1NN8GcFsN4Gbv4pl/6dyV+pALIs\nEYsxcRwhyyLBcEKUBREZJTUQRBEvTinkc8TDNqFSQ1HTT2rDVJF+r4XlmoQzBjBhqbiGmqbIikks\nhuw0t9AlDalSQgwESuenuf32fWbPzdHc3mQc9UiPJM7fegF5MsQ0BBzZwjBNpgwLh5RxIvOX/pO/\nwX7jQ8p5Hbc3ZufkXWYyT3Lv6COKhVOuXlzho8cfoNfy3D89Ybk8xR9vbdE+eo3vfxXmPzNNWoVx\nIaZgmCT5lJL2BGIwYbO5z/c33kfMltC8lCcuzPCoE+H3JRQ/4vWPhtizIrI0zdnZGt95/XWeuPIz\nfLTxI/ToIXKgIacKQSbPJB1w5frneVD+ETmjyv2PtimrFn6vQTG7iJHzmXavsjM6IlYPkKx11upP\ncnrQI1ZbeHpKNEnRtClCz+VB41XKSoZrt75Cd/CY04Mxjc5HeEGFgq6zET8kF+hculYmah9R0lcp\nZ2y2xydkTp/FyIKfePS3TKKKQqUgkTVVZHL02i1G7im7eyGSnsUbH9IZN6A/QNRDlDhCMYoUqllO\njj2MvIMa6ySRR2kuS1kqElFGNceY2SHZgcYkGDG9VEA0NYLTiFxRZM/pkdx+jeO1mNny52kO96kX\nq7zx7vsoskQiNchHn2VqVab78IAdf/cnM8v/FxGGHnIKmqIQJwm6peGHIKYukiygZC0Uy2DSHLB4\nIcRzUvRsDoUE3bKR9Cy5KEvVLqFHIgfbA5760s/iDyaoVopmaFTtBU4b7xP6AbIR4bopK9eLNNs7\nxGFCfmaW+pLGcPQeTtIjr80jJxF7O31OxzJ+eR9DyNM96fDSZ/8yW63X8dIJ+UJALH/A0lKWXDEH\nxZR8aPDtr32T/+jn/xbP3vwC9caIN09NxtpDVPeUyDvPV545R0PpMz4OKFh1Hjdvc3PhMnfvHpDT\nH1AvXqcjhojCgGphgVf+6HWWbljUSld5dP8ud4SYsj3PD1/9OpKgcXd/hG10mTp7kaW5JTqHG+zu\nvU6rc8BJM4eU1NgZbjI1W+Xo8QPqU1UeDb6OE0Scqz1LM9rnsN0jDRJOnNssVm4gKwpHRx2WzmZ4\nVjpHSZtmf/9j7j1ymL88YEpdJuw7fLB7yHPXLnNv732Wlp/neHib1eIcFKsYymPMoM1f+dIc3/62\nRDpvczI6IiMss3c8AiMiFxnsnTgISKjhEcFYIhJSCorAoCNSqIicNnqctPqoko0Xu6ysL1NesWFk\ncP/hO1wpnCVtu/hxGT+y8U+OEFWZL/yFl3hgv83e3hFltQquhzcY0uq8hTETc+g4zM9d5rTRQ9Kf\nYXq6TGf/fUZ+jOpkPlWzPz3khCDghyKqFINVgdRnPBmRLygkKWgCqJpMdxSztb1HRTUIvAmZYoEo\nmlCNsnzl5gtImsnjYZd/+MufI4xj7okjDuSEINHQlRBZSwh9l85kl4y6gJo9YDUTsDy3hjPep9sL\nyJtrBN4GveE
+Ow9iyuVZ6ks6G5sW7a0R4pks3/r+P0Vxp+n5J5TmBcqFCXpSIqeXeLD3mPP1yxhX\nqoxEl+ON3+Wbh68xDiZMnDq5zBIrFZOgLpEPptGLXVbWMqj2kyT7Bzz/hZd55d6vY9sTgrFAfXaF\nD9/axy4JKJLFu2+9gSBGVKaq7HS3SAYaqRIg6AGenCGn1+g+uk/LP8QsFZE1C7mqMD81g5vWOTi9\njbk05mhzn4wzizs5wV85Zf94h6x2gqBV0bwKrudQz2Q4u3KRTvOAk/4Ow9EI6aiNtpphtDWFXhnh\nySpBGHJ/+y5eaNA8eR9ZS5kKsnx/821WZ6eYeXKPb/wo4MAxSNUca5XrHIw3ubL6GbbuPqAv6eRE\nl67QJ+3ZmNmUaJTFScakHpzcVQnNFEPyydgRomdjGWPSkcLu8IRJq0t/9y52QSYd9BHmhjyz8hzC\nQKHR7iOaCvMrIXJbZqCcZaK+xiAy6e72Qc0ga9tIkUQtcGg1v8NkkKWaV0lmzgN/OtDop1YbRpKQ\ny6hoShYFEd+fYKo6vgembRAnIYWCzuFxj6GXkIgacZQgkDLpDQnjiOPhhMFgwGyu+kmnkVTgRlfh\n57WzXFCKqKrA0NlmeLKHEe9y7H4NUQh568EmQQoPWw0cZ8gH997n/ist9j6exjYXcdI2G4N9zl0+\ny9IX80gDB1Msc3S/Rz2fQe5m2XhLx6jMMxjo5Kws3vFjNMtmc9Tm97ZeYaE6y888+Xe5tPYFkknI\no0OBs1PP4DXGTEKHqeJNWvt7+LMiH3z4GjPlv0agnuP60y/TcDawCgLrl58hq5eoz6jYswWOx7so\nkkr5vEIkWiyunUERLcajxzSHPYwZlctLz3OlvI6tRozGXUajbaQtne13ZG4+PcPCao6Fm19kcHrC\nWXOFxJ/gDw8YHbeZHLrc/6jBdvMHSGYM/cuYsUDx7DpnZy+zvHqB0/aQ4+GYWNIR4xUSwyNr5Rj2\nS/zLN7/GaXufrrDDx2/C49NNDDmLMNnm45NX2bl3wvb+x2w/OGSxVmLkqWijGVYqNYyx9gnaPYhI\nVBm1Nua567NcfvkFzs+tkdNEDm/3ePc7H9N965h8Ps8gVeg4E56on0dXXHb9O+gzRYqZRUadNs5B\nnlZ8xEB5DyHNohdlbL2CgoHrj5DSEyoXb9Eet0iUNp9f+yKW9ucQwCrIGrIyISEgDFTiRCAKAwxd\nRpY0NMUga0YgC7hjiG2XKPCRxISQBM/32BjtUFle5vygA1NFhDAhElOkcMwFMcdZY4UHw6skl0d4\nzlXs8jf44L2P8ZwMF6eXub/3KhvvKFjaMpWbAkulhFdfOeaJz57BifZ4vHPE+bl1nv6bv8jjrXvU\nLyi0PrhD5twMz/78WfxkTLN9QtTR8OsLLM1opFjsfbBHTlrhjR/+Ls2DLZ6+8STZgsIfvvoHZLwc\nkmLycP8j1p+4Suoa3N7/Q+b2q3x4/4fY7vOU4nM0hQfk5SFjN8A2TCYTn/nZFfYGEQktnr+ZR6vP\nUirUUAsPWa2t0e59yDe/85vUz17BWJqiLK8zN2qzcekuys4qzckyzf6b3H9tk4s3C8TLh5ypvEh1\nap7XX/89hlGIYA/wT2QmTkqm4JOtL3OwdciDh/c5P/cEJCnXllZ5zzshVzQ43g4Q5EVeulrnXTPG\nSUYYksE7bx8wvTCLl+yxt3+EFmYQbR85F1E/l+fe+29TmZ3hJOxw3BVIpQzjwYRUkAjGKRXTYths\nIbcO6YpTHO61WDo/RZiOGW0YKLmUrOFy3Djhwxhs2aK0ep1H7v/I4GCZkbBPoQyTqExCi/6gSq4j\nU5hep5oVyenPsTHcYXz3dxACgVz+CX547w7udP1TNftTM0sSCNiyiaro9JwJkpQQ+SlJIiCpMgI+\noqBQsDIkQUwqQeD4xKlEnIIwGZGdWqEgA6FIcPoINT+PFAO9HqkdIigyv/7lX
+Be94CvRm/Rb5WZ\nrXo8/8x/zGn3IaVkgblnu6yZGY6dEo8fvc3aUzO0GifoGYuzazNMJj5v3n6TaVFEtkXWPv8MehJg\nmWWKyiqVwiwdu8WED2k6d1gt/QLnrv0F7r12SD6ncWb5l9g9vs22e8Bi5QL54oQPjg8YHXVZtJ5i\nZ/81ErHGq/uv8/nPvkjkTnh8dBc3tnjU3aQytcRM9QrR7gaC7lO5eJ+z1q+w2/wGs9Iiu96PyHbW\niIwdbP0CW8KPyIzvkMpdPniY551YRDUWyC60eLDxAFVMOP/ZNQKpTRq7nB4/4N7Ra2Ttc8QolPWz\n9MM7rE4rfNDcx0oqFHSZex2VnJjneOdtQqOMP4hI5BG2ZxPpJp6ioaoqV57M8PD7Ltee+AyD/j79\ngUwkgpd6yGKMafnsRTZXb93i7oP3UPoC3V6IYTVRsMnPnScnQmf8mG43JE4cilMSRmrSH01whzrV\nepnQm3A8CglNiX3XJTOMuDVXwgx/lr3om1SyeQZeBN0B1ZkM0vSE1sjEHe+hSBJvtN8g689w2w0p\nlpYIlFepzF2j05A+VbM/NbMcdPpMl7JECciyQOSrRKJHTIwchyRBRBp+UhgQCjGBHxGLCaIsI4gK\nQrZKoaTi7ndJBZDy04TtbVR7msSzEOSQJAiQ1SwX83Uymc/z+wcu7nzEyf13cLUCzjAhGM/waNr9\n39s7t9g4rvMAf2cuO3u/L28iKV4kUpRoW7IiX1TVkq3EjoM2TlK0KFC0QB/60gItWiB106e+tehL\ngDwXBdKgtV/SW5ogTeG6iYLYThRRpiSaonjVkstd7pV7m92ZnTl94ApRFCtRYpubAPsBAx4eEvt/\n2HP+nTkzO/+wkX+Nce8wplLHH5A0my6KAqZiUS8WKPhiTIxeYmw0APUOW7dv4vUK4sNJ2q006ZyN\n12PRbP0XTuM8nXCapmqQb+5Rz9jEZ4dJxGYwzWU8eBiseHiv9hYeGSMaGkGEajTVCtmqQWTgDLp0\nUFolxH6H1eICjhgiZR6hsAatuRoJ+TiFXJOTyTNII0GpHOPm1ndJWNPIfAyptnFEET1gEx228elT\nxGUDLezQKrXYud5h6JknuXlzmWNzKTLbFagXkKNjjIemubq5iRMrs5Nep9HeJTka4vv5RU7Mn2Wz\nuMx+xktooszRJ2YIdrxkb66RPFNi5XsWlYZGfWcNN2Bh5yT+WIrSXhuf1SA5cJbHhhvcWLiDoiaR\n7j6RkRbDwwNk0xYeX4mGezDGesKLiCQQDQ9Pn5/j6ttL4A2gGFW0jorr8yBcH4oIEA7ZrO+9zU5u\nHcfnp1xt0CgqIA2E1iRiRYl5YxRLq2SKEq8RQQ+0kR2DCdfLWs7DjlFA93YeOmd7tmZRFYOAXyBk\nB9s+uBlLaDq6poEUCH8AwxBIQd+qEwAACm1JREFU0yERimE7Gh7Ni2V3UHSdUDyIk2nQQqJoAUS1\ngOqo2O0iqCaYNVTLRrbryI7NUYK8nLyInZtl01yi3Skyc2KWaiGHIlzOjFzGSvpxEIzGjpCKpFi/\ns049U2Ju5FnGJ+ZRjTqZbIamT1I0cyztr/PGm29w+26B6clZIqNnsSuDDBkJkmMfQwtEsNQagek2\nqseklNtme6+E5vUiJ8aZnZwjNKLS8WRpRzQyb+dp76epdNKU17+LRwuwu1clMnyOofEpcs02UzPH\nWcj+K4vWN+l4dNbaebYydynWd7EMSVAzaVVWyKybDMXmiKrzzB79BG7Rw5HAM0RaY3T2TCIJjdZ6\nmdTRUSKDlwgOORw7lsRU6uznFwlXwtS3dTS3TcOUJH1JEsFJttaz6EMOyYkg9bYHq+PSjn6fPSWN\nKM0zoIwyKFL82rOfRTei+NwhqtkC9c0iMyfnWF9c5NbaLarpIqkjE5hRk9DwNKWOiR4MoooU7dVd\nogMxmjtF6isGG8sb7OR3GDo5wOyTs2j+U
WqOgqtr+KIOVqtCXXepkUGEJLJVp13zIHSbRlPgLx1j\nLxNAs2x0T5ATg5MYtsWencYIuWx0ysQSoxR2bbIN+6FztmfJIlsSw6fRETZOxwZHIeb14vXoKKqB\n4VGRqkI4nsBjOBheDUU3cFom/oCH3eVV3FoV1RfBCIbRNAM3naHTbCMVAaoHF5CORDga0m7zuCeK\npniotlT2sjv44wbNgoOCRU4aJGQC736VrXQGRauhq2F212tce+cKyYFR/D4Hrz/P1e+8TsFfwbAl\n/lSUYECSzW4y4CYZP/4Ctf194vsGo5HjzB59ksHwCVwRIXvnh5g7FeR+nWbaZOXuVSqVLex9yeDA\nOHduZ4mE/UTqKpODs+xcu4YMZWkWbuBVStjZ92jVdok7Z7Bxqe3fxSyUsRoWmXyFQe0cxmAKZfAY\n0UCSrbu3yTd32NtYIV1ZIb3RZquxBoEATc1hq1Ll0tN/gLBWOTGtEgyeJOjxsJhVufKDGwxrCbR4\nmHbRoZzNUm4USE02qWebPDt7gWRylPTqDUTNw9j8PNXaOrGxKfzDE3i9Izw7PQtBg+nHT/DKn/wW\nlbqHaiHKyMAQp8+fprj1Lv6On1SqhqPV8EiFUKeFNaKzl61TaAkawqKQdkmOSlJKEKchCfoDDKba\ntBt3UdUEwaDBialztMwgmUWJVY4j3Q6Nss6RaR/tliAeHMFCxazm2SrmaPsTNIsCnxqkbFco5Hfo\ntJoU87WHztmeHYYFggLD1bBUDVwTnxGm2trD0FJ4fCqq1JBS4AsaWM02ml8SiPlp1duIRgfdq5FN\nRYlaNrbbQVcN1GAEu9lCWmU6ShhV0xCqA50WiitxdY3P+C6yXP0/9LEjNJqbGK5BNDGPuPUWjYiP\ntsfHWOwYdzaXiETinDqdZHrmEv/2D19ChEP4jBa+SIiE0LA0l4QMkLEdhkSShbUVEmtlUnOCpcVV\nng69wK1vX8GIeKlZktBjE8xHptnMb1FcXCQYG0J6vFTsEm1pkxwc5EhwmJ1KkYpjEzo1i9UwiQxP\nktu/jdkusZZrUay7TKWexx9p0DFTbOxmcGoa5fAWZqGKz+fnmdMvktgeIJu7ysq1VXzHbBxlicr1\nQQIBP5QsjPEOoUiE9Moyw9tDdFrvYgRipAaD1NdaGPEJKqU04dAgahTa9QJ3awKlZvDW7uuI+hH8\naojwzDTby+sYms6Jo59AyV/h1sLXqXkzCEVjbGCIndwC/qltToU+S7b2HvXdZeywy8RsiaUfBjl3\n4QXMeBPTXSNuOKhUGTp6DDcv8dfqVJsQmIpyaXKG7731DRKek+Q9d0iFBsmX01x97w1Uv0JiTsWu\nuLSaDqobo90wqLducDTyBPsZC9NRMII69fImCcOHaYZI+lS8fo0b726Tmk09dM72rrqL16DRaVIz\nwZGSQExFVby4HodAIIiqeDA8AaLhEA42qiLwCotCzkSLxpFtlbawoNPGbJlIXUP6dLyuQG1baC5I\nBEJVwLKQbgeh6ZwejrNdL9Iu7RKyA+jBGGqnwNjzH2fg+MdIzZ/AipYYnx1H16sEYwFiaodzL14g\nMT1MJJUk6EshzSZttURi/gRHx+cJDJ8kYidoxHf4wbevMRAJc+XONwkdH+T4Y/NEh0dwNY1KuUDI\nP8iFy+eZuXiRsalRzj/xFLa2SDabZurUMLanhi/YwpJphH+J/P4K9eY+Zy99jtDIGQxp8+d//Ht0\nSgbe1DyOrXHy1CmShskTySmitkHT2sL1Z5g69Rynz11G070Q8/LYyxEuvuTDNSS+wTb7hatUHT9K\nZJLjZ54lHgtSVnfBdXnq+csE2jaRYIIwM9hmgon4k5w5PYSyE0Bvu5x66Tlu/e+bHIu/TEx3ubL0\nJaoJi9nzTxOeTHPydxbYUP6b+FOrRKIqtQqcP/5H/PZv/g2vfG6es9rvMyA8bCxd5+TxSULGEBdm\nfpeoc
pSx+BQivst2Lk8x0yK3u8C/fPU/KNa8LGZuM+yHanOTSiHO/h0dvean6YAwHbxKjNCAw16m\nRv2uQb3lotVthBnDaPnRmj5y6Q6N3BqxhJdqy0tiIEl9o/XQOduzJ38detA+fX4Ofmkek9enz68i\nvbuC36fPrxj9ZOnT5xE59GQRQnxSCLEshLgjhHj1EOL9oxAiJ4S4cV9fXAjxP0KIFSHEt7qlnu79\n7Qtdt2UhxIsfoseYEOJNIcQtIcRNIcSf9tDFK4R4RwhxXQixJIT421653Pf6qhBiQQjxtV67PBQp\n5aFtHNwwtgpMADpwHZj7iGP+OnAGuHFf398Df9ltvwr8Xbd9suukdx1XAeVD8hgCTnfbQeA2MNcL\nl+7r+7s/NeBt4EKvXLox/gL4Z+A/ezVGP2s77D3LU8CqlHJTSmkDrwOvfJQBpZRXgPID3Z8Gvtxt\nfxn4TLf9CvCalNKWUm5yMBBPfUgeWSnl9W67DrzHwW3Xh+7SdbhXAdvDwYdYuVcuQohR4FMcFHK8\ndxaqJy4/jcNOliNA+r7f37dM0iHwgco4fVCEEBMc7O3e6ZWLEEIRQlzvxnxTSnmrVy7AF4HPA/c/\n76GnY/R+HHay/NKdp5YH+/afq4zTB0EIEQS+CvyZlPLHvltxmC5SSldKeZqD6jvPCSGe74WLEOI3\ngD0p5QI/2qs86HqoY/QwDjtZHiyTNMaPf0ocFveqaPKLlHH6RRFC6BwkyleklP/eS5d7SCn3ga8D\nZ3vkch74tBBiA3gNeEEI8ZUeufx0DmNhdN8iTuOgOswEB8fKH/kCvxt3gp9c4L/abf8VP7l49HBQ\niXON7oXbD8FBAP8EfPGB/l64JDkoXQXgA74DXO6FywNeF4Gv9ep9+Zl+hxHkgTfkZQ7OBK0CXziE\neK8BGcDiYL30h0Ccg3pmK8C37k2c7v//dddtGXjpQ/S4wMEx+XVgobt9skcujwHXui6LwOe7/Yfu\n8oDXRX50NqynLu+39b/u0qfPI9K/gt+nzyPST5Y+fR6RfrL06fOI9JOlT59HpJ8sffo8Iv1k6dPn\nEeknS58+j0g/Wfr0eUT+H9bjcOnSi+aPAAAAAElFTkSuQmCC\n", + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "# draw box of the ref using 'green'\n", + "plt.figure()\n", + "refer.showRef(ref, seg_box='box')\n", + "# draw box of the ann using 'red'\n", + "ax = plt.gca()\n", + "bbox = ann['bbox']\n", + "box_plot = Rectangle((bbox[0], bbox[1]), bbox[2], bbox[3], fill=False, edgecolor='red', linewidth=2)\n", + "ax.add_patch(box_plot)\n", + "plt.show()" + ] + }, + { + "cell_type": "code", + "execution_count": 51, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "IoU=[0.09], wrong comprehension!\n" + ] + } + ], + "source": [ + "# Is the ann actually our ref?\n", + "# i.e., IoU >= 0.5?\n", + "ref_box = refer.refToAnn[ref_id]['bbox']\n", + "ann_box = ann['bbox']\n", + "IoU = 
computeIoU(ref_box, ann_box)\n", + "if IoU >= 0.5:\n", + " print 'IoU=[%.2f], correct comprehension!' % IoU\n", + "else:\n", + " print 'IoU=[%.2f], wrong comprehension!' % IoU" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [] + } + ], + "metadata": { + "kernelspec": { + "display_name": "Python 2", + "language": "python", + "name": "python2" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 2 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython2", + "version": "2.7.6" + } + }, + "nbformat": 4, + "nbformat_minor": 0 +} diff --git a/tools/refer/pyReferDemo.ipynb b/tools/refer/pyReferDemo.ipynb new file mode 100644 index 0000000..5e0acca --- /dev/null +++ b/tools/refer/pyReferDemo.ipynb @@ -0,0 +1,229 @@ +{ + "cells": [ + { + "cell_type": "code", + "execution_count": 1, + "metadata": { + "collapsed": false + }, + "outputs": [], + "source": [ + "%matplotlib inline\n", + "from refer import REFER\n", + "import numpy as np\n", + "import skimage.io as io\n", + "import matplotlib.pyplot as plt" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Load Refer Dataset" + ] + }, + { + "cell_type": "code", + "execution_count": 18, + "metadata": { + "collapsed": false, + "scrolled": true + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "loading dataset refcoco into memory...\n", + "creating index...\n", + "index created.\n", + "DONE (t=9.88s)\n" + ] + } + ], + "source": [ + "data_root = './data' # contains refclef, refcoco, refcoco+, refcocog and images\n", + "dataset = 'refcoco'\n", + "splitBy = 'unc'\n", + "refer = REFER(data_root, dataset, splitBy)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Stats about the Dataset" + ] + }, + { + "cell_type": "code", + "execution_count": 19, 
+ "metadata": { + "collapsed": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "dataset [refcoco_unc] contains: \n", + "142210 expressions for 50000 refs in 19994 images.\n", + "\n", + "Among them:\n", + "42404 refs are in split [train].\n", + "3811 refs are in split [val].\n", + "3785 refs are in split [test].\n" + ] + } + ], + "source": [ + "# print stats about the given dataset\n", + "print 'dataset [%s_%s] contains: ' % (dataset, splitBy)\n", + "ref_ids = refer.getRefIds()\n", + "image_ids = refer.getImgIds()\n", + "print '%s expressions for %s refs in %s images.' % (len(refer.Sents), len(ref_ids), len(image_ids))\n", + "\n", + "print '\\nAmong them:'\n", + "if dataset == 'refclef':\n", + " if splitBy == 'unc':\n", + " splits = ['train', 'val', 'testA', 'testB', 'testC']\n", + " else:\n", + " splits = ['train', 'val', 'test']\n", + "elif dataset == 'refcoco':\n", + " splits = ['train', 'val', 'test']\n", + "elif dataset == 'refcoco+':\n", + " splits = ['train', 'val', 'test']\n", + "elif dataset == 'refcocog':\n", + " splits = ['train', 'val'] # we don't have test split for refcocog right now.\n", + " \n", + "for split in splits:\n", + " ref_ids = refer.getRefIds(split=split)\n", + " print '%s refs are in split [%s].' % (len(ref_ids), split)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Show Refered Object and its Expressions" + ] + }, + { + "cell_type": "code", + "execution_count": 24, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "ref_id [22758] (ann_id [540661])\n", + "1. woman in front\n", + "2. lady smiling\n", + "3. 
woman\n" + ] + }, + { + "data": { + "image/png": "iVBORw0KGgoAAAANSUhEUgAAAMsAAAEACAYAAAAdo4LwAAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAALEgAACxIB0t1+/AAAIABJREFUeJzsvXewbXlW3/f5hZ1OPje/3K/7dZgOk7oHhhkYZhAiFMmW\nymCEURmDbRkJlWWkYowBSwjJhUpg2QiXXVgYHDRgE0QQaQYY0gwTu6fz6/D65XffDSfv/Av+Y597\n3309TTNVZuiR662qW/ecvff57X3O+X5/a63vWr99hPeeO3bH7tifb/KNvoA7dsf+XbE7ZLljd+xz\ntDtkuWN37HO0O2S5Y3fsc7Q7ZLljd+xztDtkuWN37HO0zwtZhBBfI4R4XgjxohDi+z4f57hjd+wv\n28RfdJ1FCKGA88BXAteATwDf6r1/7i/0RHfsjv0l2+fDs3wR8JL3/qL3vgZ+Dvimz8N57tgd+0u1\nzwdZTgBXjjy/utx2x+7Yv9P2+SDLnf6ZO/b/S9OfhzGvAaeOPD9F410OTQhxh1B37AvWvPfitbZ/\nPsjySeBeIcRdwHXgW4BvffVBv/g//jCDXo/Qe0QyZKMbsUgzRmHAo+fO4seXGO3dQBrBfu5Z2TjO\neLSHxrN14jh7o33OPPAwj5+/yANvugfdP45KAmJCimJMnc1wPqDV7+K9x3vFb/3aL/Ged72LOIxp\ndTrkVck//ef/Ez/49/4zClMhg4B0ssDh2buxTRI4vHHs7Mzo9QbIIKC3MiDqdPmRf/GT/MHv/T41\nEiE1CLU8TzMPCCEOHx/dBuCluG37wf/x/j6DlRWEEIfHSicAh3QGV1bcnYT0Qo9RAZfzmqkOcbpF\naCwIC1LgBTgBUt4KHJzzSCHxGITweC9QqMPz2yU8pJTgPXvbN1nd2myu+8i1N+Y5GkA0ms5rv+db\nx4jD/wfHHf1/1A73IQDJ7vVrrB07DniCQIF1KATC+ds+K6OXn5nS4GqUCpHO4xdT3tXv41otPjmb\n4eMOOghwR84tpENK+KPf/OCroXpof+Fk8d4bIcTfAX4bUMC/ei0l7Bv+o2/j4rPPs7Hex6kIl+Zs\ndtfRezs898IlutIynbWp5zdJnWDX7+EXOYm2pJeus7axQWk8UkI76ZHohLzKcKok9IJ2lFA7jy9K\nrDXEYYtu1EF4Sb7I8M5jnQRjMWmJy3NqW5MkmjDuEa0OyIo566c3WV+3uEChpCAQEq3g/d/5Lfz9\nv/WdfO8//Cc8/9wF0BLvPM4D3jVfmmj+jGu+UOcdHg6/JCEF3roGjIgjoGyA4gUYKQg96Lpio5Ww\nJqEUgkuZJVUhwmsEBqscCvB4GnjfAl1d1yipl5tvAdQdOZ8SAnzzao4AeznQ0e8XKcXhOHAL9EeJ\ncpQ4QhyQtjkHt3FDgBfL6wbvPELe+kTcqyYa5xxKCrxf5hDeg5QIQBuPlwLtHUYI8A7rJe0Qhlrz\ndJpidIsQifEWLQOEWH5mUmKdfV1sfz48C9773wR+8/WOya5fZefCS/TMKUztsd4xlIY1ak4+dA/F\nNKM/zFGDh1ntrCK2Nth/4nmuXnyCsrZMFyn5znWMKfnox/+UaVriixQZSrwI6ScBVniiIGTYHtIf\nrjJcW6O3uoE0hhoBdQF4itKRTVK6LUe5v2DwyN3M9i8RK0c2muGERiGYz+YUacrKYEg5mpDbm7zv\nix/kJ/7JD/BffO/38/LlbZyKCYXDGIOztgGelngvwMsGG3gQAg8oKZBuCTA4MqMCXiCNwVQLTrcj\njivPXEquFI6ZVvhAIvBI4RFSYo6McYBH7z1KNTRCeAQSvEYIjxDu8BiBOqTZwesPMf0qb3Cw59Ue\n4dbh4rZ9AnXL0wiB9+7WjL70brfO5Ztp45Dz/vYxvcUJgRAQSIlzDuMsXoJEIGk+B4tDAh7Behgi\n
lGCUGQg7WOlwQmBxy/MIvFDLSeDPts8LWT4Xq2rB6tZpom4HXRYkcYtiPGdtY0g2H9OSGqfB147R\n1WusDLuYOqMlDYoKM1lAteDc5km2d3c4ttKn31un9JKtBx/jyaee5C1vfQBZ1hTWUEznyCrjY3/4\nQWb7Y4TWBKHGlClXr18iRFBawXg0pveAZTyfc+7cfaRlias9Ck1noEmSNnVZEnfa4D0rG+skpuBn\n/8UPc3VvwXd9zz9gUUq0DjE6xDoBvrwFNkAtweC9v/VleU+StBpoeINazraBsZzoJgyFofYFl0tJ\nHrRxQqAEODxyCSyx9AyHIdwyDHOuAefB81sh0u2gllIeeoik0/6s8PFoCHXw/LXqdEePP7pfSYX1\ntiHwcqzm2uRtr3XOHV6LkoKk3UW4xiPJI2Ma65BK4YU/DNnwEo9HyJBCe9qznBODLnOhSfHoQCKk\nxOGa0FQ1lPIC3J+D2TeMLM+98jI3L99g5Z2PYZSmrmp6nR7ZvKIqK1qxZjyf01UKX6S89MTHiUxA\nHMYMekN8XaOVIp1MCL0nEBZhPbGQmMWUYnKdIL8LmSSUuyN6seSeE+tgauzpLUprQAS87xv/A6r5\nDuOXzmNNzSMPPgw+5ezpexAqoKgWTIuCci/lpVdeZKszZGc25h1ve4hLL1/k3KlThEmb0XjCPadO\n8OF/8wE+9CeP830/8IOouEsQxdRGYYy5NTtahxMgPIgjs3SctHHCNrN+XUNtOZN0GIoaK2NeSg2z\nQOJkjBa2IYtoyID3SKU+C8wHRDkK9FsAPpp33A7wdrd72/YD+7PysINwT0r5WZ7lwCNZtwxPj+Rs\nQsjbrtm5W5AVDjCOYbePd03cZXEImrGEkFhvEWqZL1qP8QYhFMIaQNCqavpBzPOLlFIHhMIg0Utv\n23wHXggcHvH6juWNI8ub3/Iw0VvfQrS5DjrEC4mfLbj04iu8/OwFNldXybMFwd4IqUP8oMf2jT1O\nbLRRMiSOu7SSFnWZ0RUh0zylrBVrp09jhCIJIVvMaKEJdAuvDFEc4J2nXGTE3lFKj8nGTC9fhWqK\nUCFX9y7SKXeoM02oNIGGgXXUvYREt7jvoQd4QBZoHfKudz6G0RKpE7YGAyajfRbZFO3m/LN/+kN8\n+I8/zq/91oeI2j1UoDB1hVLNl6OcP8xhEB4hVTPTuWbm9abiRCeirWq8D7m4KNhNYqRwaGloAjDH\nAbZePesDGFOhVHBLWPAe72/N2kfBcRT8Rz1MM9/KQ8Afgvog5Dv4W77u4P8BSZejL8dmmSMc8W7+\n1v6jyf6t3MjjcHg8UumGQMtJxh6+52UOJQXOS4R3TUjmYC3QyCjh0niCSGL8MvwVzjcJJRIpPBZP\nIF6/kvKGkUWJmrrKSV/cI253KGuHjlucvucugjBgEAdI5WiZmu3phJPn7iO7N0dp28yiUYypKz7z\nB3/I6eN3cWmyy2J/j+KJz3Dq3ruZvXKRD/36HxImK3z9t3wzk/3r1HVFkWX0ojb9VszaWoIPFM+8\neIljLcvqIKZMc9phF5MY0nKOqQTWOjY2TyI6MaNsypo2uNBh7YJ0URC1++zOZrTCCOENe6Pr/NWv\n+Qbe/o638t/80Pv58If/mH/0wz8CSiMQGOFxgcJ7iViGUN57kBaMRpUlx+OEE75C6ISXFgXTWBPa\nAsIYIdwyZJAIwW3A/GxPYDmaUR+A0FrzKg9gl2A/APJRMvnbkvCj5zkglVp6tYNtt3kWLM7d/tpX\nh4SvJsqBICLULUHAOYdHIJc5y4EXkkul1+ERshEJvJQIU3NssMKoNNRKo7xqFMCDvMsdjCtQCjhy\nja9lbxhZwiAhzeboIMQ7gzQ1l89f5oFHH2NlfQCLBVWxIPSeVihY7F9D1zXjGzdpDfvEnQ51XSKd\nZn3Y5q67v4xWu8N0tMelK1e4vLCMpobt51/gv/y+R5Dn7sfKCL
/dTNE73o+RNvizz3wEu5AhIsFQOihwJMSxRDd0Crqg5jg/W7D85DjPltwDPMvP\niMrTdBtFSizLYOcFu3Arg/ieQ6GQJZAJcauKbdlsv2IPhlAIGg1kFEIUohQyNNsgfQ87n8dtO+Rz\nPUzPnGPLUD9SU2g5Me1Kma6BIZLA60CDbImzWuP44RNs3LGN3u4c86dOkrPTECvYhsHpw4fwmk0m\nt25GegFZITACn6rvkrJtTp08QzaXp7xaoSeTY+nEFImi0mpkmHr8rWz6xmcRl4N1wQL9l/yI4rO3\nkgQRQsYoSae/RMqYKFGQms3hF45j6t2IdtKRlBtGh7cYRqyVSli5NOkU5LIGqhaRz5okvqQRNLBU\nMA0NkU2hZhWWlmdQNB9hZIl9iZUyabfbaLpJKpsha2fo3XyO9HQNViDcrZKdv47RyatptgK6+mwU\noXDyxAqmniWKkk7KJdSOIZ7yY1+ulxzzpUSRCoGQ1Bw45bm86bJdHD95DDOX59TcOfSwxU3j66gX\nG2ipbnqkgRLXMTSFuSjk8PQZUn0DBH7ASqWEaVoM5vOkdRNbKvhBROzH7Nl9MSI0Geofoy+TotGq\nsNY4h5cY3Pyq20hcBccps7R6kmJ5if7eNO/4gzu58dW3EtJxj9HOy206X+J8kSKB4toyQnl537D/\n8g6+ECIDfAt4r5Sy+ZPnzjNY/r+h8mQHjKkqECQufuQgm3Uqs7OYXovK3BS2GoNuoQYRedNCJCGF\nXAaBZGigHxVBu1gjZ9pkUimCIEJqOlGi0t0zxuDIMF5llbQSo4ctVpYXmTozxeU338C55UUWZmfQ\nEujKpGkWl3lm3xOkB/K4qY6JXhK7JLYg25NnoG+Y1fkVdNVACxLWrdvA4tIa5XaDY1NnaQQha4sb\niIwLOwSb/wHGyKGO2YKMUdHQhNrZX4lMao2Yg4fPEjTB9DQKsUlYbXVw16qOYZkYlkmz6bCysszS\n8jyIgDBqIaMGuhZi6CFCtkC0qdaXUC1QdaXDHdEV4jhGVVSymQwqHcjQ5iv2wV8BfwKVlQ0o8Siu\nv4FcPo+mqCASAi8hiXSSWAXU/8flE0knUKToeCnEcQwxRHGIhiR0fIbzPfRoOrt3bKPoCp6sV3lk\nbZlz2QKHpKA40MuDxRLHVlfoHRzGrTS4ZtseUobOYHcX3WYGI1Zo6hZrTsD1t74WRc/w3Ud+xOgV\nF6NMDnC0WEPLZhnr7WP+8BFEGLKydA5ThY1bJ/n+kw9w3S3X43kuHac6iZAKUSKIkQg6xLFvfet7\nCNUiES8/d/yXZhYhhE4nUL4kpXyR8vUzofI+8IH3n3+mcOmFW7j+ku3oA900my1W6jUy6yZYqZWw\nsl1oukGr2cDMd1Op1TFTNi3HRRcqXb15Wk2HNdfh6JkzDGzeycrCArXWMQr5HLV6myiUeMUSXqtC\n19AglUceYf7cLBs2baTYXOTRp55gaKCfdVs30dfdh5ZK49smYaPJFTfdjqupSD+gTx1lJJNn+egZ\n9EKWYhTh1BwUBUYmx7jomv3oy4c6SLwvgr9wKbouiCKd8lqRMILAjSFRacQaxaLDJCYUa0R6zNqp\nOcYme4liiaZq6Ok0edPCDzvrlFrDw03ACQWeUIiNNFZvH6GMiX0fM53GdRySWEdTBapiYWk2SAXT\nMBncOEturgwHIP6KwonPbSU9UCdlCAxTkCQBAg3dUnDcCClUYpl0UBKI82lYB/2NBHn+X7RUBOqL\npK60jowCBtaN8YOH7mdy+xYGuwpE7YRCLo9bLdEIQp5eOIfntxHorOvLs2f3hRTrK3SldELfx0k0\nBic2cnxqhl9461s5eew4zx84xJvvuIOllVVOn1nh9le/htlDx8maWZrCY2b+KD05m96dm3n/R/6W\nWI3RZAcCFUuIhYIUKhqSIEpQFHjXr72HRx8+iu8vIPgZ5S5CCAF8FjgupfzIT5z6
mVB5f/4Xf9ax\nsUmgsnqaUHqUqhXsdBeGnSWoN0mkhqoKPBPM2ELJ5chYaQzL5IUnHmfXpm2UqmW6N25CJjFv7h3F\n0DS6xsfQc9vQvTaeTDh+4gTbdl8HxLSDADeMGe/tw1JNFqo1bthzCWrWYGl1iYGuHIUuGxHE1NdW\nWJ4/R6qQ4+Az+9m8fpyRkQlaq1WOvXCcqBWTUXVQoLfrIVJ8C24HPgzuBTuoPX8j0gxotQOCUFJa\nqpN4KtX5EqWVNgQQxG2GsJlHMOcY7CSLocQIEaKikEiBbmVI9BhVU9FCB02NsTQDkbIQho7rxUSq\nSdq00QwDGccIVenwSpIERVUQAtZf9Ry8E/gTmD6+C7N/Dw2pE6mgdArztJr1jsNk0kH/SdlJX+DH\nqZdQfvzZHQYlICCRsLxWZKB/kGKlyPrxCR556gl+7lW3cM+D97N+x1ami1W6+vtYPHWSO667hvnl\nFZpth/mVFbryBZZqLfpGh8n2DbDv0HFed8ebePaZZ1icX+Btv/hz7Nu/n7W1MrfecjPPHzzIStth\nV18/MmqhJZJdmzdxyd4LmX32AJuuvohYVdBEx6JVEzFq5KIkCoYwuPOX38nhQwt0DV5A0gjQYpfV\nyk/vw/+vpGFXAb8E3CCEOHD+cRsdVN7NQojTwKvOv0ZKeRx4EZV3Dz8FlaeITpqQyIRQCrI9A2hW\nhlQ6QwpJ1jLpMWxMQqhUaFaKxI0qiefg+Q5J7BF5bTJmmqTlEjs+od9CBg10EtRqhVa5xtGDh9ix\nbRsijjoOjnFESsZU5+dxy0U2TU6QTZkIP0QRHf66jkYcJfiOy+nDJzj53GH27tyFZVocOHiYUrVO\nFHUM5GQsWLdrma1774Jbgd+D2nWjNA69B7fq0lhp0Sy3EU2BHdk0Fl00tZuc0YVMVDzVwAh0Al1B\nzeXZf3gBD4VASsJEQ6DgRXBmdpWzqyWEUNGsFIZuEgtINAWpGAg9heN3FMZSKIRJx+XRskw0RWXv\nbZLMuVU4AMk7dIT7C/QMDTEyNoYSayR0sNnZXIo4jomS+CU+pKqqLz2EEMTnIUHi/K5klEiCpOOw\nousaRVyqpTJ53ebKPXuZPzdLureLA8eOMD4yxsT6jcSWydpaiXw2R4Ck6HnITI5rbnsDVTdmZm6R\nN77lzZw8cZLSWpHXv/71HD18hPLaKjffdANPP/EEQbPFza++HV9TqbZ8rtywiWEhcJ45TPnfHyRc\nrBEmMRAjIkFY81g6eZa5kyV++W138vSzR/CiCEfWcaIQX/0Z7VullE+8TFDd9FN+5m+Bv325zxV0\nzNGkKunpHiERksHxndTLCxjCw86YuNUGotHJiw1Vp1WrEiEp6D2Yho6ugx2EBEGAUASmpeMGPhkh\nOXn6JLqZYseOnbTaIa1mg6xt4TptDh86ysTERsIkYmV1tWPwoGmslerUKsdJElhbK9JsNunOZdix\neRsrzRKnTp2ikO9jqVKjUa2RSJXRySIX3f49lNdIeC2s/UKex77yZkTjFEO6RtoPOH78MK3VFuuH\nN1BfcSjX5vHVVEf6Ig1CJNm0QU7aLK5W2RT2YRogkg40NlEShkfHiYSC9D1IVGLVAEWhEXjUGy79\nw300/TqKZVKvlSkYOl7UkedUSyVSww/Dr9NZq5y9FHXWoVSdIyoMMThx+flrAjIJUVUFVZVESYyg\ns/HYUYl31OIIQUKH9KUpClJI5Hk7JVVonFqeZSxlEzkBvUYKNyegVkQIwfXXXcea2+aam29GFldp\neQF9kxP0j46RzxZ48NHH6SrkuezSvTz79HNIdK659hp++MO7URR44xvfwHe/9z1MM8W2jet54qGH\nKTttJvqGGR4fYnlxjiNnjrF7eAzx2a9yyV++l1BKFN/lhUee5u677+Ox/dO0/H6M7MVo0sCTYHVf\njCJd4KeXj18xbVgsI+R55bGqpxHnccvnpk4j
2pWXdthbtToIGFs/ydryCmosSeey9Hb3UGs5YIBq\n25RXizjtkNHBIQ4c3sf2q65GNyyidpNsWiXf30NxYZlMNsPNr3sNzVCQVgRuq0G6K0ukSHrdgKcf\nfYZmtaMIGFk3SuI3MKyA2lSZrnQX9YZDtVVDaBojIy3u+PV7MN4ewyZo/KHNAx+6hbhdJlvTyMkc\nJ/fvQxgxfbkRTp1aJYo1RpUUwtAoOip11cJN2pgthWNJif7Q5Mnnprji4u3kiFEExJGKqgoizyVI\nIgSCRO34HcfEHH70WVAU8uO9DIxOYHUV8IVPCkGj5DG5e4HMXKkzq3xNY+6eKxnq7Sao1EkUg1jr\nLEIkktlz8x30ndBJhOTF2o1UgESS0AkMCQhF4MdRh8gmxUu4h7m5WcYnd1Fqt8iG5z2Pg5ih7l6+\n95270HvzpNI2r3vHO5iZPkdXbw+zMzN8/+7vsWnjdnZesJP777+fnt5+brnldr72ta+ASHjXu9/N\nxz72EQYGBtm+fTv3/+hButN97L32RqK5ObSGz+m1Ena2i7NBi3WrJTSpQgSf+Pjf8MPvHmF2dQlF\nSHx/kciIsMxJQukhzRQq/5u2Fb/zDbeRSlv0FHJs2zLJpZfspbu7hw39XcSOhqLpmLpBPfRJPJ/u\nbAE7khgJqIpKu1gFpYFumVT9BUY2TlJTHZZOHWf7hvXExTVUM4XXarJWLNJsNdk6uY7a4iKyUWJu\naY3erh7USFKZC9HMLM/vP0bT9bEyOUwrYcf4MLXFmKUjx9E9l1y+n1KxjhGr5PN1bnvH97B+vwMZ\n8j9q8PCHbmXIGUet1CgsOjRas7hqhtVikeryHLI3TbNRww4EbhCg927h2re8nZnPfIic59BQQjbp\nDu1mD8+cK7N3JE3aTiNJiMIQqZnEkYKQBrHQUbA6TvGBpKCpNGaKyAYsNWtceeO1tBQPU9N5yxun\n4DeAP4GVc9ewUjcZGummZ6tC3/h2dNROAMoOB1LTNOJIoiKIOw7kyPPBIM4v5BUhSOKx3sQsAAAg\nAElEQVQEXdNfYrEoQJgkHJw6SVBqIgs56mtFNA0UIbhw+wU8/OiTfPyPP8Pj+57mwfseRNF0jhw9\nSq1W44YbbmJ8YgOf/exnmJiY5NZbb+ff/u2bJAm84Q1v4KMf/RiZTIYLLriAe++9l6GhUTaOruex\nh+7l+okxqo5LuS0Z2bkXWUjRLk0hkzpRW+Odv/fHPPzkezGLc2iiiaL4543h+85T3Wbh/0X69YoF\nyx0X7MCybSY2jLJu6xiamSYiJnbrKFKiSYGMEvRI4rkuiWGgBj6oKr7rYVkqXsshY2qk4oRz+w/S\nPzTO5slJQq9JkkjcsMH82Rn6+wfI2RbtdotcNo/jurQaLQbHJkipJivLi5w6fpRUKo8S1Ni6aRNJ\nWMaNfBYqZXZs2oJqZXjwkcdwIugetLntV79N7sMOHIP4XpVn/uZqjnz3LBMjMet7Bnl8+TRrtRb6\nyDhzmRRFL0BJoKFKdENFD5pIXSH2BW1LkG/HTGh55qI2WwiZqTQ4kcoyhoOp6qhJBzkXQcdkW9GJ\nUfENg1qSkFdNlABEs8GQafH0w0/ROz7AL/zWIPn5tU4F7Ksac3ftxWsUmTseUGs4VJoKW6/ciBQd\n+nCtVkMmEnHe6ERIOr9R0wjDTrXoxRkEeGntImUnS1BUQSsJuWjPRUhUMpPbKTVKRGGA6oRMDvYz\ndfI4V15+JfvYR6lSY9u2XfT09BCGIR/+x49SyPfy+te9ia9//WvEccyNN97IXXfdhWVZ3HHHHXz9\n619nYmIdg4NjPPjgo6SHujDzQyylCgzvuYo4nSGzZQtdu/aiaSYrM7P0XjTBn/7F7/LeX/8t2l4d\nRUl39pHUFIH0ySQgxMu7u7xiweLIFtfdcB0pVSMlNap1j0RG5Af6cX0XNIEaSbqyBZ4/eRpdMWm7\nIb70SaKI
xPNxQ59Fp83c/Cz5TB57cKxTVbFsZOjjmiZDOy9CNxRknCCjCCdy6NuwkdzmbUip8vwT\nz3B2fg6hmCzMzHDhzi14YYMrrriMttNkfPt20Ew+99F/QTMNtl4wwbar/4HubxThLkgehUf+jwuZ\nu6vGqGVSWi4zVW/Q8kM27dxO2fHQtBzDsUekG7gDPSRRgromqa9M8/gXP8hI4uInMbElWAwF6XaT\ngZbGVKmCiNMMFCClSUQiiBWNRFeJhURHcP8zz5HTO/a2s3pE2tFYjHwcVaF+donxrcfg/cD7oF19\nFcNWP0ZBQ7M0Km2XQqHvpYpXIhOOHD2IECqJ6CRmqB2b1DiOX+IyvhgoLy7wX9pmUzrl5CCOGRgd\nZn7fcWpdGfKqhSsVlktFBgeHmZqeYa5cZtuW7fQOODQaDR597BGmpqbIZtO8+efeyDf//RsYpsbl\nV13O3ffdg5nOcNnll/CFr32J8bFJBkfHePzhx7nw0t0MbtjE8/uO09JCurvGGLB1KuU1ThTKXCSv\nINOdRySwfnKMwG9hSgtPkwh8QstAhMso+CSx+bL37CsWLFe+4c0kuQJtJaKe+LhKQhwEiMjvkH09\nh6DeJD+WZeOlexASJjeME4YxfhjiNxtYiULgh+zafREoCrplkXcD1uYX6evqp7s3T743z/OPPs7G\n4RH8OCCbTtGu1qnX6hw/cgbf88kbKWKhsuI2CZwadrfBUw89RCqVoVxvc+TISax0mnx3iok9/8Tw\nc0sdf+Mn4OlP7KD6zBjjG03OVFYY376N6SceJ+VFlI+cAkMQpBVSQkf3QrqqNbJ6ghdEZGWMTCJC\nIVHR8ByXfDaHSsCIkWPB91lsakQ69KgKWd1Cmp1d80QRSF3n2NQZdig99NpZjrVaVDUIFa3T45M2\ncbwV6AWehNSvHWCl93oyI5PkDYPRzSbSHkXoOgiJIgVTZ2bwg4BE6bDt45dQED+eTeAnysgvHZeA\n6MhiAC8OGUjnmWs3iZMExVDpKnRx+MxJLrzhWrpHhpiZmmHm7FmEImi327z2jtcyMDDEpz/9abLZ\nLNdffz13ffvbjI6OsWXrNr77vbtZt24bGzZs5IEHHuKK629AqoJvf+eH7L7oWq654ip6sxb3/eDf\nefwHj7HvuYd5281v4Mv3fRs/ibn7+w+AkpCEDhEhigJqoqLFgCpBvHw/yysWLD2pHmQsyJgp2l4T\n29awewvEjRaqZRGaCWlbEi4tIQXIKMY3TcKWi5bKMHd2jnRXF919fYR+AEhmz86B0JgYG6dT24ko\nLy2RVRVEu0U6axI6DstLq0xPzRIEkCSysw8hE5IoYMO6dYRBi/7+IVZXSkydOocQMRffcIZdVz5J\n+p42vBd4EKbuu4Du0zeyqs7xwrlZjPFhHjlwgGwYktVUmpU1+rp72FCUzClN1IFuIjXiXBgSZlRM\noSEaAaro6K90BMPrN7A6fQKttUZcC2gN9WH5FiEJkRliaAamahBJgxemZxCoxGqI7kR0SZ1V4dNv\n91DxHFKFHB/7024++c1l7NeC9qEiI3d+hpOP/Bptx0WYFvmxDKncMImUqEKwfsP6ju4L/sMs8uJ4\nETj04vkXg+XF3hYFga5rLK4t0QukbJu218ZtNcmk0gyPjvCpT32Kd/3u72DbNtddd22HLADs37+f\nL37xy1xzzTWMjo7ywx/+kAt3XcDExCRf+upXueDCCxgeHOH+Bx7m2muvp+o1eP6Z4/zcz72dnv5e\n9j/9IM88fR9Z3Sdv6Hz+X7/J+t17qFUbKJbFR/7xX4gxiBXwQgXz/N6QjENiofGf6H3/w3jFgsWw\n6NBsMcCDVCFFw6mTVgSxmqDEKlYmhbfawDQ1QhGTRJJmaYWlaoNdl1+Dls7iNRpops7c3CwDvf3Y\nqQJSxjiBi+IIMn0FGqGkSzepl+ucOTlNreoiYxUvcYmTGN/z0XSLTFcvDz
/2JKap0Wy3EIqFXajx\n6++9j/yTbbiBDgzmu/DcwhaWv7uF0U0FujRJoWDTs3UrU/es0GMVqDkN9JEBzjXqWNik0fHKLqqV\nYcppMqllGOzvoRwtkvgdRomuKSwVy6w1Kqwnzc5YZ1kTuH6IamjUYo9UpKOonT/j8UPHsbDo7ulC\ndy2G2zbH4gau5yFMg7DSYGnF5u/eN8Rf3LWMcgWkt5+ld8tXmXn0ZjZsHOzo8JTzFS8k5bUqqqIS\nJJ0SMT8RGPBj4YsEOC956WDIOu9NpEREMU8cOsC29AAnl+bJ5TNErkscS4Sm4LRbfOZTn+Y3f/NO\nTp06jeM6FItFFEXh3e9+N/f88IdMTZ3hjte+Br/t8d3vfIfbb7kNuzvHt7/zba6++nKkjDi2/xRv\neO0bSPw6n//0Z1DcOt1ZnbYjaDcT/v5Df8ftN9/Ev3zqS+y98Tomd17MmVOHCL0UURgRxy2EUAjV\nkChWEDLg5cYrFiytlQpKymSmWEG3LBrLawRJQCPyCaIARdNplUvIJKHZbGDYNqXVNSyzwNj6HURG\nDtdPKJUaZLqyrL/yMlRNZ21uhcrKKut27CCSLpl8F5f13UK72ebYjx7EyvdhJm1azTYiAl1EdPUW\n2LFlI4MDvcwur1Kr11laWMYPAy6/fT/5j7fh34H3Q+sGmxMPXEr17w3GbryY+WYD4QcUhgZxlZjR\nnl6GXJdcX55y4IFhs9huMNtymRxZj1Gu0KsZVOst+nWLQV1hMdCpihCz7eMtLjCZHkBrugzJCHXN\n50xBEHfnSccagepRiyVrtYiCaeN7LaJQ0FZjhrB5wpQMKx2sqKlapHIZHv9+yCc3dvNb36nArTBx\n3wms23YRt6+kNzdJU6aQUqAqOk89/SxC15F++NI6BTifXHXSsUTyHypHLzLvk+Q8IFYRrK4t8/ZX\nX48iwEinySoqrufRCB3ymTQz5VUeffRRxteNQRyQhB679+zhk//8aV57xx2s27CBtbVV7r77h1x6\n1VUU+vv46le/yt5LriCd6eOe+x/j9lffzpMPfZ+12ZNcsHUSJ8kyPb+MLztlbUPCSqnM8MYNPPnE\nI3zgb+7kY//X53j4R1Ooio6m1UkCEzV3KbEekk4CYP6n3rOvWLBkN45h93czIAW4IdgW0mvjB52F\nvqLphJ6HVg9YW1qiWCly6ZWXUUhnUaSCE1SZm59jdGKCvGXhTE/RavmkBoa5cPM2VuYW6Rnqo7Ww\nypEXjtCs1nGCAEPTUaVAkZI4dBgdGiCfzeGHDifPnmFufoUw7jjQxzJhYGwB7gU+Ac+Hl3PsY1cz\n/9hhrhro48DxwyRnl1DCADkyxOH9yyjLRab9NhGC8cExUsLER6V/cARfxmhRzKiZ4ly/zczaCutS\nGUQUoEgPRTNJgohUf5pivc6wbaA1q6QKXSTZHubdANsHGcScWVxGU3QswyLnJhQby2y74lauHR5i\n8cn7sWwNLdHp7ulmvlbkK/+UZsdOnes+uQpvgP6n/40DZ1ys1PsQqa5OKCQJlmniRTFSSlS143aS\nnF+fvJSC/WSgwH9I1ZTzJeiW51Lo7yE66pM0NBqeQ2BEREC97pC4Hs8//ihra1sxbIt8Pk+pUed/\n/OHv8ZGPfZTJDRvwvICbbrkF3bT48te+xqWXXsmWjZv4t29/i6uvupDvfOEjaBpcefXlnDx1grVq\nHamoRFJCGGJqKs/v28ftr70dL0yolUt86APv452//DscfP40vuIi1EFCUSOvdWHp/2mD5E98t1do\n6HFCPL1I6+gZKgvzVGemSRptkuUKqVaAXm1hVFvUK0WKpVV2XrCTXCaPZRksLM/TbtTZvWsnlq7h\nNOqogcf4wCC2blOv1zEUlcSNOH7gGKViGdePUBQLL+g4RAZhx4tYNU0a7Trz8wtMnZmj3XZwHL9j\nYCBa9HaV4AgkF8GTd28imS2zox6SOr
vCld1DXB6mGWmHbIk1RosOF4YaN2j9bBUWq8vznCrN4SoK\nvesn6enpxQlbCNEhFDtArGpsuv71FC6+GRSbLkXH1CwKo6PM+T69bVjnC/pDAyVQqLRjQqkSxwlR\nnNClqAyagoHRYWaPHeKmdZuorK2yOrdAJARqIYsTRIz0DPDxP+7n8EQW3g3iTXDh1fcijemXdGRR\nHFMtlYFOBpYkHZ69/E8W+VJ2OJDi/PH4vE+wKjqCy0CRpPM5No5OUMjaxLqOIj2ErJEfNFDTAt8K\n8Yhp1JuUKhVWSyVOnjzNn7zvz7jlttu59rrraDQbPPrYY2xav4Err7qUb37761SK83z3W1+kv8fG\nsgye2Pc8S3WXQOh4cUwUhSSE+FGIoen89Z//FRddsINatYaiKHzuXz/EYC8Yaoxpg5oU8etneMsb\nL3rZe/YVCxbVEDS9Jrahk0sU7BiCtQqGIqm3q7T8BnML50BV2LprJ8uLy0gEczMLjI1O0p3rZenc\nCvVqm3LdZbZY48CJUyRC4IQ+1XqV559/ntmFeSr1BiuVKsVmg5LThJTOwPoxpJWh5oU0Y4Ebi04H\nYNIxqEtn06zf0UbZD+yAarOXLZs20hO67I01+ppttDgk39tPkksj0wpJ5GK6HlbYZMzW2aVo7PEU\nml7IwsoyCytr5Hp6qCQ+aSdm+9ZtVJwW6uhujOvew5LjkQMaK0VUoVAHllMq08Uih44dR9i9uInA\n8UNiVMIkgThkSDPRLYv1TouD//RRBkWCranU6w2efOEgmp0iCiL8us/n/uJSVn9FhQ2g3hlSGHkf\nitIAEoSi0Xaa52eQn7hY58nQL6ZbnUMvFgHil6piL739fLXu0aefIG2a6EpETz7NrvFB3nPz1bxt\n907ecf0NXL91F06lTdvzMHWLHTt2MNQ7yrNPH+LA8yd49JGnOHzwEN25Am996y/w+S98ktnZ4wRu\nA1s1WGqGrHkxrqoRCIWW53VmFTrEZCFUFEWhN1fgV9/+diZGR3nyqWfRUlk+8YVPIc0MMrKwZYwa\n1rnppr0ve8++YsHy/NP7OHz4GE899xxP7X+B/ceO8sLx4zx38DjTZ1c5fWyWdO8o5PtotkLarQjX\nCRlYvxlfgjnQjT3Ui6pp9PQPsnnzTiY3byZ22ihuk5wpaJdLhI02MozpzufoTqfIahopFdxaGaRL\nFLh47TZe26HdqKMlESk1IQxcBkeWOh2FV0J9eZRWbY3CUC+psQ0E6RzVcpOVdgUqdeIoIdYUlC4b\nK5Mim89jJgqJDuPSY53QiJttHMVAzw4wtGMTU4vn0IRgcf+DyBOPkbFtotijJOssN6uUc2kOiQij\np4eB7gwZUURTYxwnRJUKhRgsoeC0A3r0LkpCIZ94bBIpUpHKOlWCVNm5cTetpouIE1bnPf7pv12I\n91EBU6D+3RL57DtBhigiRtP+F3PvHWXZWZ15/973xHturpw6R6kldYtWzmoFMKAGkYwMzmBg7DEG\n883YHnscZuzBJjiQwUnYBgmwMEpGEkLIVkChgzp3V3dXdVV1hVtVN9+Tz3m/P251Sx6PNesz3yzN\nWavWqnvuunfdqvXuu/d+9rOfR19BueiSKKW2whlLEQpUkiKUQKguUCy1eEXUr5ttEpWS0oXE/+7R\nb9Mz1o9pw3DO45bL1pKzfQrFhHVFxa51A7x352Z2rRvk1vWjDM2d5eYNq9h9w/UkQUSr5bJ+64WM\nrBnmt379w5w++hK6UigBHgmNSBEiCVyP2O9gGRpSKDRpYZo5nIKNZVskKiWfyfHB930AyzT43Be+\nxJ6jx/j4H/4eod+lUxXKBartV2/wXzN1l/r0fgxDx281MVVC2PIIiFheWoZEccG2bdRrNQI/otFo\nMDo62i0V/ADDMhGmQbPeIGsZBK6HiFOk6or2LdbaHDo0Thy6xAoaQUBpcAAVgopjEIo0SbuoW1c3\nlc
D1aTRqmJaB1AW2ZnLLT97H2Mcm4D3wbOYNFMav4/h3nmJ7VEQszLG0bgC7ETE7NUF++zZe3LeX\n9UG3NGkUTXwvwpOSZ/WQzYMbmW/UmWxXKBXK+HGAEgk9vqJgaMi2C1FEU0s4Y2hoZoGBLVtJpyeQ\n7TYyjUFP6RkcZT4Oqfg+se9jeS2uLQwj9AzNRpueNCESiuNJwMCG9eydX2DQyFIoGKT1Nr4b0mPn\nueJ2l/f+9iHk1cBnIL7jE6jkA1z5ustYrHqk6St4YSu9ystlmESpmCSNu573wkIRIdC6KwFSEkuF\nJTWsGC5d18tvvuUGCsqknYCmQRiBIS3SpMW9B8eZXgxZkzrE0uHOX/gQzsgAX/zrL3NyZorTp8ZJ\n0rA7PBUaqRJI3SRB6y6cKYUQiiiKMU0Dx3GQUpAkASRpl4irGyBg9ab1XHXtNSzXUz760Z/mr/7y\n77nnr/+c9/z4rXR8lz/45Bf+/eou/6cufXoeI5dFy+g0GnXKpsPcmSlqbpvLd1xOdaFCZaFCz8Ag\nY2NjBEFAEscrBzvG0vMsLC8yNjRErlzGa7dpNVucPHqEZjui4ackKYRJCEowO7eEplIM08Z1g64V\ntUgg7fKeVJoiNJ0gTknjGFe59I/NdDPL56C8Zyf1ZkqY7+HQyXkylmIWH3CpWwlm3OaoneKlMesS\njbZSGMrA7kTEdkipWGR6epI1WYeF2UUGizmsYpZ8T5H6zByZWEcJG4yYvsEicaqTkHRp42HAKimx\nQ4U3O48sZ1FpgkhBGxvlmfkGPXqEHccMWTmsyCeHwG97DBqCTHMJApOsk0PXE3wV8dTTQxh/P81d\nv9OAb4DYvQepaQwPj1BvnsEPwvPI8bkgOUdtEXTLNEM3yTgGKtVRRMRRiu/7pEhIUmIFIYoXzsxw\nshNz+XCZchIjBEQdH9NOCWuKvMgxH9U42pzg4rUX8I5f/hne/M638MzzTxIFCUolKOhuMmomCI0I\nSRwG6LogCWPSNMayLDKZruNwFIfINEYq0IREJQkIyenx06Rpwsj6HXzuK1/mY7/0K7zw1HfZvfst\nfOKPP/nqZ/b/eFT8G5dWskmkwA07ZDSdEwePoDIGWSvD7Owcru9jZrM03Q5n5xcQmkTTJUvLS7Ta\nbUbXrMXMOEw0WoxpDi035qUDJ3l+/z4u3Xkpjp6lKTRklMHBwUhZkQcyMTNxd58mdKkuzBGFMXam\nK5sjTQMhBP1jbazpCByIenIcf2aJTRdfy9MPPsZOzaR3aJQr33ob3mSFxalJ/L4epjXB8qET+J06\nkZGnXa9TckpkE52Z1hLlYo4kjuloMYFKaLSaSK8FkYchbAQdcipDve0SxDFLrWW0NGFEghP7WKli\nUUuIlIORagRCMbTpQp6ZfZaBzRvxDp8kSlK0KKKYM5iZX0A3FNbKvklISqLD2bDFyPBmxo9m4bYG\neAA+y0vLHDt6tLtHI1bWIekGi9Qkw8PDCAFnz851SWMi7cLFSYTUFGoFYBZKoqUKiSRVKYGe5z99\n6V4+/5GfYV0mRQJSS/E6Lkr3eeeOLVy3wWCyPs/vP/wIdjHDc49/H8sU+LGHNEzCVENJA01aKAQq\n6fqIpqqrG21ZDsVinlq9jiZFN+PRHbSKtFu6KaUYXjWG1BxmJo7Sahd5/PGn+aNP/h6Tx4/y0Q//\nCl/9+v3/5pl9zYLlhyeOsW39ZuxYY2l5mQt27sQjJhYC3XJIgoT6UpVGa46Lb7iaqFKntjxLwRnG\ncfIQpfhtj3y5zKmjx5mZPovKGbzjP76bU7NTLJ2ushwuMWAMIoMElYCuKZI4IEWR6hq+nxAlPmmS\n4HsJmqbheQmWqbNmQ+V8vzJ3qpdqM+X7f/cQ5cWAvoEyZilDy29QPXKSilclCOusWb8Wv+3T2x4i\n1BX+2Bqe3HeARcsiKxJEHNNUAdtHVrNQr1JPwekk5JRAJREaGo6h
s+w2MfIl9EhDpG0SobEUxtha\nhobUiE2bRIQkSvHCdx/j9mtu5rFnn+baiy5BOzlHIi2M0iaaHMNIFXUV4UcJ2zds5qXnnqZ3aJR6\nfZGeKA8O4IIXdHj08RdII0GMD5hwLqugMEyd5doSuq6jGTpxrBBaCnqKUAopdUAghUCQoIQiJUGT\nAtIYz3T4yKfuZt2gxU+9882U05RhpUgzDs3GIrpdpt5uMTo8yHSnARHUgpBEpCSpBMNB0v0iU0kC\nKkEIiaWbWLqGLgVLy/PkslmiwMM8v82pMAwbP4lYs2EduVwOJ+tw+OA4laUF/qb2FT78/g/Q8nwy\nnVc/s69ZsFx2xQ7OPHeAkUKZgqE4tf8FPN+jMDSAYVi49RamoZPXE6Tv4S8vklMRTr6HpcoybhAh\nUsHx4wepRx5WPs9CZ47D33+Esws1dFlApj4in8HMZNAMhUKjo8Ucq0zik5DpgJEG6KmGiGOUiEji\nCDO2yQ9MwkPANXD65AjaQottpsWAErw4cZANi3mKZkJ09CShDKlGIWptm+kzp9DNPHpGRwiLqzds\n4/Fmg1nDBi+inDWZn6lQtxQzOUna9hiTBpbUMOIYqULypTKtBGwRkWoJU4FL3jBZVehl1UiJuaBD\nkArCJCEnNLz5BTbkexifmWRV6MLazWz97U/ztZ+9kTXFMmOlHoglZ6tVMk6RHrtEppzHap+BDOBC\nELpMzzWJ0rSrMilWuF5JsgIjy/Obkud6mXN0/jQ9l2G68PG5HucccoZKSZSkYRXYV005+pWHkF6d\ngWKG3lKRN197I6uMhMG+HhrP76HabiHtEh2lo2s26DZCGgip0xWeAKlinIxBLmvjtpqEvo9jGlh0\nuZ8qDhFKR9N0UgkXbN2GlbEJgoADLx3A8wJUPWDf1Cmcj/wSgxsu5+6//e6rntnXLFiWJk8yONpP\n2HJpVltki0VWb9xIrdNCpQnFgT6iOETTU+J6jUwxS2epw+mzC8wtLOAGMfOzi7RDj0AI3CShlTYo\n9BZRkWBka4npIzO4RsyZ1gJSxgil8LSIZbdKYmnMNT02j43Rp/cQtTyMjI4fBhhSY92WRjezfAD6\n67tYXjhEdk2GUn+ZM9UOc+2QUhBTzGYR+TzxUpVcrsjJ0ENTFkHQQERZZmstVtsmSXEj4VrJ8swp\ndMegd3iA2XqFgTVrmZ+ZJO/kSSOPQpSQsWzSyEBzl4nMAp6pM9pXJm4soYwSeWnRjjxSLUVKRbtV\nIwl8jLLAlRLR8LkkG7FzZAN+2CaOPZSf4C17ZHSYmTrJhtLFtBuqm1k8kJpLtbGAkCkGGRKSLny8\nklm6Oy3y/HwFJFJ20bFusKgVe/V0RU6sO9HXNK1rjZd6CF0hE4hj0Ap9TMYpzY5G2xnimJdyevoM\n61avZvHMJGGsIS0HodtdpftIgRIomaJpXd8XXU9pNqpIIrKOQeInhJ0WhiHJWBkCP0KzLNZs3oKR\nsanMz3P27Bxh4JEAeZGy66ab2L/3ea697kru++qXXvXMvmbQcauZ8MILL3BgYoLxZp2jy8u8ODvN\nVOhzarFFVVi4fSW0tZvwCmVEoUxkd+G9jF1mvuPS6rFYNl1OhcskeZN8uY9TM2fpHevhxMxJFIKg\nUUdTAUGnTZp0mFuYYvHsNNs3bmCpUWeuVWX/wjgzUZWWiHHjgHwhoCiqMAnpNo2kM4bpdUGD03rA\nruF1LJiKHxwYZ7bTJPRcCoYJekh2oJeKCoikIHA9NMuglCS0jxxkuT7PyMAqyoP9JColqjXRUoEx\nUkYZGa74zF9RyfRhz7kkwqMw2IdMJTdccytnGw2SKEZHI5fJk9VzZHxIhMLPm2R0iUxjGrqkp13n\n/h9/J72VOVqtJVSQYKQpsd8m9FysUoE4V0A5fefLMClc2s1uHaIQKCkRmkTJrkCFpmlIKVAqAWKE\neHk4qes6UnZdy4QQXSES0ZWO
PUfGlMIgjWISkRDLhCiI0WJFHEb4aUJs2bQ0g/XbLiZnFQiiBENo\nmOgoZWAKSUYXZHQNR9PQkxi/3UYkKWkQE3o+SRpi2Dq5bAY39LH7Brhgx6WkwMnDxzgzNU07cPHC\nABuJLiV3vvUtPP/iS5w4dQZT/F+6KUkUsvOyK3DKZTw3xEjBVyEDm7ZSH58gSSSWAq/SIUybHDkx\nTuqH1LwOSwsV6iKinQFRsOkEdSqdCloEzWab+T37Wbd5PZqSWJZFGkUEvk8ninVmP2gAACAASURB\nVNh88YW0jvucmjzOdddfwpnTk0hp4qUerdYSWc2gd/hUV970Cliqj/DSvjMMjg1Ttw3KF1/CWs3i\nhrEiD56a4FDVZTgyyHspxmyFVfk+3OWzOAnIJMUulljyO+TCGF+IrkaairD7imRsEy9UxEYZrzjK\nfGE9V/3UXTz6F18gXKyxfugCnvPP0pidQTgFGm1Jbz4HQsc0ffKaiTIFk4vLFI0SQ6aO8udQpo6h\nm+Q0gS40MDTSWGHnshSFIJ/ro3Z8nGKvfz5YNOkipUGagJIvy8DJlUZfrcCzL18KIbp9yrmSq1ui\npYj/CXiVgEol59deVhgDADfdcAN9AwOMDo4hDItHv/c4qwZGyGQ6LNeapJqDFBLDshBJB1K3K7RI\njKZJAs/H0rq9Usax0TSNlu/ROzLG6g1bmZqaYurMma4IvEqRKqXftlASzs5Nsmb9CG9565v49J98\ngSB+dUXK1yyzrF8zigTcxRpxrUZzbo6wXieqVHDdJrHuEzTbLM9XOHXsJLXlJpWlGpXmIktJi6rX\nYGG5wsTZBQIvpO41cGOPZr1Jp+Ezf3IBGaZ0Gh3qtSb1dodKrcFz+/eiZQze9LbdNFoVDEsnTWM0\nXXH7rmsxtZTtO314GrgWqktrye2fY/OFm4mFQpRLPPrM06hDRxmRivzqYRbTFDVSwjctypvWMW3r\nzMURS3HAYuoShhFODGkYkxnsQaUJKQJTN5Cxy1C7RaNylNHThzh87z8wnKTYmqSlJfRlDDpnTpI0\nXUbWbaOTCDoqwU8j2ipEapLUV9RFyrrRzahEQyiIkogMEpQgSCIaUYu03SGptcgqgRX4pLE437Po\n0mO5XkOTAl3vcsLOy7WqlzcIX6nqoq1knHPZw7KsrmPbK3ZdpHz5sZTdH1S3sDNMk8suv4KMafPc\nC3tYWFziJ3/iPXzw/e9HVxFB6IEE01ToeoSKG5C00WVAGPmEoY8SCVKHQiGHIqHd9igPjaJncuzZ\nu4fxEye6mgKkyDih13ZYNzpKvVXj3e/9Ce657z7Wb1rP1MTU/26r+LXLLM3GMpq0qdbbECf09ZZZ\nWlpi7uxZ5GCBtGwzfWqOg0cOUllcRAFBEDDbqDHfqiJti56+fmq1Fl670f02jCOiOIVEUsr1YCQm\nIoqIUoGQOkmgWKxVKOtZHn3oASoVl/6eXrykw5YL17E4N8mmoV6Kvd/tBsuvwcETFr0L85x1fWbn\np9ASn6hS5YZsiaEwxtw6zMGpBeKZChMlk4t6VqEGxsm3AzKBT7NooXSDwTTDvB5hDZTJLNeo1ZYx\nTBNXS+gPwZYh87//G+gyJPQ7ZG0TreUhUp0eBFYhz/yp49iD/eilDLFQUMzSXKpywcAIZ2pLnFia\nor+3h9T30VLQIoVuKtIVSaNcrNDDEL/VwtFs5oLay5lF82g1GkjdIE67Si5qZXj7ykwhVrhfauXA\nn8sqSdIVretKvP7LfRgpX7lR+TKXzPM8+gYGmTgyzro165ivLrPvxb3UWg02blxP70AvL+0/ht3f\nS+RVsWSK1B08zyOKU1JiTNOkXCrQ7rRRcYLpZKg3XOrNCkngEa2QQkGR1QS5osPeicM0g5R3vP3d\nPPXUP/Pg/d/H0jMkvPry12uWWcL+tcwsNxjbso7NN11L2pNh/euvZPTai7FX5UmMAHuNzZYbL6
a8\ndZBf+fh/Zviqtex6600MO2W0jmJ+fo6lSoXGfI2wE5PEgiBIsHMlEqGYixqMN2ZpRQ2Eimh3OoyM\njiJSi+efOUbchsWFGko3OHTqOEZ/L4enxxnorcAe4GoY35+jd6lB9ugUNzYk6zsR/bqJ3vawZULj\nif2svmUnfUWTDUJw9qEHGK4sstSpM9Xx8ZYj9EKBlpmS13VaUUrQU0aaOraRo9QA31CkUUrQcSm2\nOsxLRTnNopsSgpCCYaDqVQZNh1KuiNsO8L2QNNUIMhbj9Vmu0Er0Tc8T5mzMKCUQMSEhUzLtGp6K\nlFgltA2D05HH3qhJZTLoZhYPNFxKWafrE0myIkAh0YREItC1rt4xSp1fQxZ0AyCO45XASLDsrtyr\nUglCvAIRY4U+o+kIoaGURhDFzM0voUSC57U4evQopd4yd9zxRq67/Aqk5zNSyqC5dWzDIEKj2qgT\nJjFxmpCRgpxlUKs3afkRgZah1gxZWlzC7bQJ4nOfM6Ug4MLNG5lZXiJKNOrNZZycze7db+Tur/49\nkXDx+b/UU1KGDaZnTtGX0/jB/Q8Q9OdIJkzqHZflyjJ9g12r59mjC2iWxncf/w7DY0VOn55h7ZYh\nvMOn6FSbZJ0Sy60OmiYgihkuDaCkoFWpEHsBbTcm0EMypka908Br+vzlX/8Fb3/nO/H8BqVckdGx\nQSYnjnH1bTch7YNoBxSshxk3g1qQrB9ZxfzJCbzeIka5n9bqtaQLiwyXcljNGpN7j1MUJt78IiVl\nUkVjQShqvXlealUpzjTIK+gIyZm5BbZvvIiFVpswk2OREBkpxoDUkCSpRlMKWmlEmIYsixA76OBn\nTYr9ZTqNGmm7hu5k6MTdnRMj0Znwa1w2tgE1MkRcrZEJLE6qEF2zmGt0KMcJPf1DOJbNxi0Xcnz2\nFKv7HRIm0KRCJorFyjRSSgxdEEcKeb5Hkf+Chv8yXX9lYCk19FcIWkgJ6fnK7V9uW57brjRNiedF\n+H6KZWaJfY877riDam2ORx99mIWFBW66+Qbu+/vvML+4gJNz6PgB0F0atCwD3dTxfB83iImUxA+6\nWtFx3JWLUqKbDaRSjI6NcvTUKRJdcmZihge+9tfc/tY7+ME/7UHZo0hrEBV7wOy/fWb/fzv9/x8v\nt9Egly/z/N6XWOhUGZ+d4tvffpgXnt6HEhnuvec7zE4tcfTEGQ4dHedv7/kWLTdi1dqN9GwY5Pa3\nv55Y02g16pg5i0QFZB2DKGyhpSE7b7yEN77jRmICGq0Wy/UGw2MDDI/1c/z0US68eAuf/OPfxg/q\nJKnP4Ogov/2pP8BLnjnfr0zMDzMUaCT1DkY5T1LM0p5bZK5ZpV5t0Jmv0u+lBDNLyA2rOKWHHOpN\nedHyOaQ6hKUsGy68iHJ/P5uHRrl89Tp2rBrFDH1K/aM0/YCeTRspjKxCIYiJuz2GrrNspvRsWkfP\n6jUUevsYXr2WzNAgKuxQbLUx4hRPSlI0AlJm0xC5XCc8coqqbRHuuhlj+3bsUGLkMvT0j+AjqNVr\nTI0fw12oMD4/ReCKlxExPWSgp4+c5aBrEhCvYBmrFURMvtyz6GLlua5NxjkErNu7vFx6nVO2lOdR\ntS6sbJoWqK7Nn+8HVKvLfP/xf8IyHW64fhcTp6a56srryOdLdNohUujouoZhGCgl8KOEjhcSq64I\n4LmVgvOfL+2Oixw7w0KtRjOMOHj4MEePjrNqcJAXnn+OW267DS9uoJsGwvde9cy+ZsEyfmicE8cn\nmVuoM1VrMF93ydt9zIwv8MT3nmFsbAMvPH+AltvBMB1I8zzy8PMcOHiMu+/7FntPnqKWKJqNBhnH\nYmikj49+9MPYJVi7ZZAd127lzXft4qY3XgkiwpQBZytnkZbgyWeeICVk/OQhNmxcw9LSAs1WjaXF\nGhevX4QngBvg+cMmuVgi0pSznYD8mlGCpQp9vUUm9JRD0T
InHZ+jGcUPY5ewUGSkZ4TbdlzD69Zu\n5uL1FzDQ24fX7iBbHvHcIiO2xeLyLGvWbaSvp5czk9MEcZfQqSmwhIGumURS44mnn2JucRGv4zM7\nc5aJmWnK+Sxre3pI/YBOHGObWWJNQ5Vy7JEVUhljB5JLrn8T+06dwPRc0qBDc6lC1GyQCyPyYcCQ\nYxDGbWJfdkuxDihRJwoCMrpNPp+jUMyfb8zPXWmadqWSVm69cif/3GPDMP4Vbf/cde6eYRjouoZl\n2mQyOeyMg27o3PnWn2BhvsX4iWluvvkNTE9PsfuOt2Ca5srr5XnQoNXxiFKIku6cB/FyXyQBQ9PJ\nmBaGZiB1k+/94El0M8O+gwcxyFAu9pOmHh987y188g8+wP33/Mm/+ryvvF41WIQQthDiOSHEfiHE\nESHE/1i53yOEeEwIcUII8agQovSK1/y6EGJcCHFMCHH7v/XeodugN2+RzzkMlnrxF5apN+rEpqRa\nr9FqNujtKRErl2xOp9Dr4IUtnv2n5xjIjiACQeK2yZgWuq74mfftxhppsvs9b6B3XR+5fMIPnnuM\nn37fexnbOExuKM9dv/AGshmL3/zQr3H11TvQMpIdl1zOzh0XoWKoLcxz0bomPAncAsXC63GGshys\n11hOXOaXl5j225woaey1Qcv1wc6t3LT5Yi5atYHtN1yHJmKaLZfleo2lEyeoVpaYbTWYczt4fhux\nMMsqIYimp7DaTdaYJp2wjWnYSGVTtwR5H5ZSn5xRQMSK5U6HfBAST4+jt5axHBs7Y+DFMZlSH6a0\nSF2fyU5CZyRPLCLu/6OPkY087JLFgDQxpIahdfuGIIkRskBDA38+A9uAf4bX76ogpQ1agohC9CRg\nsOxQzOkkaXenRWoaUqQYEjS6fU26slUqdW1FMznCMLtN/Cub+lcuj8VxiqbpDPbnGT89TjtMeOCB\nh3H9Brvf+mN0vAaPPfFdbrplF1EUct211yJJCWKPUIV4SbBiIqGhCYkuu0KB5zKfoWuYmqLc14uP\n4Mqrb+GB+x5AVzEbegtU/UVOHjzEJz78MXJzFY7+/bd45Atf/PcHi1LKB25WSu0ALqErDn4dP6JF\nHkBzqU5zqYG/3ES4EWM9Q2SERdYuIJTNqZNTFItl+nsGOXJonOnTc0ydnkOXDp1Wyr69R8gVewhN\nqLfaHDlxhly+QMNd5KorrmL9qo2YRg9/9bW/Zf2OtfSuK/Hc80dZWK6z58BeNmxYz+J8nWPjR9m+\n/WLu3H0H116mY+4DNsBUaHHoyCKZVKPvkq2Mmwmtusugnud1F17EzhuvYdXatYwUe2m0axza+zwy\njqjNLTI/MUncaBAv1zA7PlmnQKhpCMNApmA1m5hnZxg2dGzdJB8ktNIEteMqZsb6cHSNDamOSANk\nHDFNlchr8jqVoTq/zOxiBdKIjAZL8zPoAlIhsDIWs7UGQQB2JyDwGjS8NpECpWsopQikotps0AlC\nik4P+77ZB78I/Bm88+2zJGlKsafA0KoxpGmRCoFh2UgBGdvGsTPknCzFfB4NgSFkl9mLJPS7lHgh\nBJZlr2QP/V/QZLp7ZF1TJNd1CYOAQqFAs9lk165bkFJy5MgRrr/+eqIw5aX9x9ix/UriUONDH/hl\nMrpD5CVI9PMl3cpZ7QaJYZDJOKxet4bPf/mzxCrhs5//PJdddSmWOcDnP/VV2o1JVpf6WXpuP/2t\nGKfqkql2EM32vz9YVj7EuUmNSdfZpkbXIu/ulft3A29d+f28RZ5SahI4Z5H3r64wSfHCgNjUmQld\nJpcWOVOpsFhtYBg6pqFz843X8a633YGT0ZBS0VPo+rckSdT9R3shhqXTky2QkzmK1jCnZ2Y4PnmA\nKA7Bh//ygY/QN1DkpcMHaTY8ZGziRjEkNqODQ5imxK0GXH7xZfzKL2yDx4DboLK0mUsvvphksoqm\n2/S9bgcvNSrMNGtUpq
Yp5Iu0m01ap+eonj5DcnqaTCvETiRZx8QyDRxNIoOIfCYHpkGCxItSlJXB\n0BReHFMNQ3KJpJ7V6fv5D7Drzp9CpQqHhKJh4mkQ6CYNoaiSEBs2UurIlkvarCGjAA2FkpI0CBmf\nn6GR0cAGzZIoqXAltMMI0zTRHItMzmH9RRfQP7KKqf2XE+4yYBbGFjxueb2iubREMVcGdKJY4fsh\nRSeHTLuSVBnLQpOyCyunCkPTsQwdkSp0qZ1Hyrp0/pdh5nPZpcspM3CcLNValbm5WdasWcOLL7zI\nt775TY4cPszePXu46uorCUKXY8cP0dtXYM+Lz/NL/+E/krWzqFStIGtdfloul6Pc08OqVavYvGUT\nd/34XVx7/c387n//BA8++CRTU3XWrhsl9Ov0FPr5wd88gNloEy0vcnZxCalSMhn7RwsWIYQUQuyn\na4X3hFLqMK9ukTfzipf/Ly3yAOqtDs0oZMatM9FcwstoWLaF8APQUu58725kMeTW22/hwW99izfd\ndjuZjE6pN0cua7J563rcuEnetNBNna/d8zdYWYtdt7yeN+zaxcPfvw8vqNOiyYv7fsj99z7C63dv\nZ9X6IvuPPcmZs8eoVZe5eONFfPqzn+Uv7/kqvvtQN1huhYe/t8zBffsxay5zk6fI5QpsvOIq3JzF\n/IFxMkoQzldpN+usMfKsLfZRyHWNYPNCUM6Vuor3YUhfqZcl36XlB3iui8haSLvAqq0XMotPM2zR\nGyVc0Ih4/LNfIdE1Qk1gSR0jShhMbHryvTRNg7xSpJ0WuucyiMJMU9qRj65p2NkisabTbjVpyhQ/\nCImiGKlJpGWjazqaadJstDl+7AR7D+3lBz98nqe+Uepml8/Am982iabbTExOEMYxiRJknAKObaJU\njK4LNCkw9W6A2LZFwbLZuGoNYwNDGJpGlITEIiVfLICmd20QSLoSfJEPiYelh5TzGqQB9aV5Tp04\nyKWXbmHVcJmevM383DQv/vAptm5dg50RHJ88wZt234Fj2uzcsYOeQh5EimEY5PN5CoUCG9avp1Qq\nIzE5fmKCk6dnMNFYv+4CLtx5OS/+8/fYtWsHowMbeP7pp3nx+CGeOHaIQ6dPUYk9NOdHVKRU3fHt\nDiFEEXhECHHz//S8Ev+SB/Gv3uJ/dfPRZw+CJvHTGLOg4wwUMYWOsjXe/f73cGr5ONXA5I8+8/vc\ncusu3vXuOwh1nxPjZzl55Bij61az/YpL6C0WaLrLXLN1I5/9yz9k68WX89k//yJXXbOVmUmfb/7D\nPzK2aj0/PPXPPPa9Z2nVTPa+tJcPfvBXefQ7/8jOay/ld3/nN4hYYn2+AwcguRp2b/oE/oIg2Xs/\n2cpZJmZmiD1B2HIpSRCOQZKAMiUly6aqSeYXKmR0ExHGuFKSdprUhM7A8GrCfIaiodGXNTBH+jnS\nCEgT6JAwpVxWx4LHfveXWZW2mDUVwovI6Sm9UuIkCqfWJpEChUAkJpoQZIUASwfLQsYQS4FeHMDq\n+BxyXQbsApaQ5GKNOPIRrZg0kAxIjSHNYtmyWW9oVP6mQHrfInITXP+JNuu2WIyfjMlZ+ZWBY0Sk\nEuJErcC+Okkao5kGQRgy3DdAo17HyeZI0gQtFIgoIKcC1o/102NZZE0b27QIQw9NJZDEQMzk499k\nVCnSiUlOnXqWYnGY2WqL7RdvZ/0FFzC6fi25fJEzU3M8+MAD6JZGYbDA23beyZHDJzh+/CQZO4vj\n5LoKLvU6b3rTndxy+y1knR6efvIfCDJFwiN1solPGLl87APvY+PYGPuPHELYJqEUPDVxGk6P/2jB\n8oqgaAghHgJ28iNa5AHcfOkWMrZNtdPCk4Jjc7O4qUa+r8htt9+A+VJ3u3DXh67kySef49jkUerM\ncOzkIW655QaOnj5BYaCIKCjq1WVanmTjuo08+K2Huevdb+eev7uXU4cr5JwCb7rjDoaz
Zf7rf/k4\nv/tbHwe9j/vveZS2t8zP7f55Pv+N/4EjfsCNWeAqODKd470/+RF+9rpb2d3TQys6i6y1yJo2VrlA\n2mmweGySIKcz5ClqkUczhdn5eWTOIRMm5Ab6MCxBnC9SXDeGmj5JtqdA7HbVGRfrddx2h3I7Reo6\nbuKD5pGkBqkfsVrL4CQxWRLMWBGnAkNKyqnAJ6JhKfw4hUhHej62MpDEFLIO+9t1ooxFoNoYiY4X\nuvT19EC1jiUsDDTcyjRJGFBTJqUJi1N782x6Zwu+DG952ySf/qNVQIJKFAhFnIJlOyRpSrzi2UIK\ntpOnFngYOiRBh4IhWV0cpGTrmEnXwUymMWnSIPElBTPTLVHtLHEcYJo2cZhSa7UoFnp5z6/+Otsu\nv5xEM4hQ6EnKsQOHufzSbbznZ36STMlBaBopgjhUfOQjH2XPnn1omiTVTXpGBhjd0MPQSJZHH3wQ\nldFZu6qM/9zz7L7rx/CExxsu2cGDzzzDHTt2UjQEvhXTiXwMS+P7x/6d/ixCiD4gVkrVhRAZ4Da6\n9p0/kkUeQEBKO+jgmoqFWhN0C98L6Mtn+frD36DSmqY23SF3x5t5+83vZbxyGg5n2HJRTLaQ4V13\n3cFzB/cwM71AbbbNoSWPky8tYOcz3Pu1++grbuakv4iXtMk5EcWi4NFHHuG//9ff4MDk83zmE/cS\nBW1+9b/9J971jnfQWfg6fKf7Fz73fA6tpVh1skOQxEw6El/T8aVJPVgmDn1yi4v4wxnqZ5pEvRkc\ny8Eu5RlYtQoxeZaak8FUIdO+YnFmkUqzypHFKgVT0h9J1ndCZlIXTNlFmYRNIBXIGBlqCGK0RIHo\nNrK6adJJIgbf/R4O/eNDZDothBF1B4MxdCyJb5jIIKZj6yRAKCUi6VoRYlgk0kRTJkmS0kg6yMFh\nJqKIpudRuHuATb/WgjfAO47VuPsra+i4JrFKkehErEzkhUClEZCSJooUgUwgb2foz1jkiBEoojBE\nL5TQDAu1skqcoFj2A4SXorkdDF0j9j2CMAGrwM//4se44KprSbqNDqYQKCnYtuMStmzaxNLCNH5k\nkS/1oltZpKb4s898mr17XuLDv/wRrt+5k2LR4fiBl7jrnXfyM+9/F9VaG7PZ5pg/T2wF/OCLf8uw\nnuUd11xB4rkokVIyLEQcI+JX1w3732WWYeDuFURL0jVgfVwIsQ/4hhDi54FJ4F3QtcgTQpyzyIv5\nNyzyADKOjUpjmvUWaRITBC5pHLG8WME0d7Bt+zamC0t85ut/w5urCzTbEROVRXoHTfpHCoRhwI2X\nX8ex4ktIXyCDlDBoM1Du4+ihM5x+8cWu2aaj8dS+F7jk0ksYGR7m8OwLfO2+v+Nzn/g0n/7yH3Hm\nzAT7Dx7gva/34JeAe+DJzytKPb1c3rOG6X37WHPppRiDJeLpRbzEZzyuUQgjBre/jiF1lqDHolBP\nmDFjnHKJ2Wde4jhtMu2AKjF2FDIgLGwzZWsgEaSISJDP5jhrpEwFErM8gLY4ThgJYsMmR0is64Sx\nwFIproq54t3vYv6Sm7ioXOTEl7+EFuiYwuOsZnLFh38VM3B57HNfIDR1HDRSNExSIpkyl4ToCtww\nJDE0NMumnM1hFfNYizWWT+ep5ubp2dzBfCjmTW9Z5p6v9qGQ3UySxF3HrzRF0w2EUF0ULlFkMwll\nS6IrRS0SmKZNb18PzWYdEQY0Gx2QKYi0OydRCkNKzFTHTTtkzSx33Pl2Nm/aRGVmlqGR4a5h0gpL\nWZGSmhqprhP7MbWFRRJVwSn3gYC+vh6+8udf4MD+fTiGxtatP0ZraYlSXy+F3hzZoknPxWvIWQ69\nVh5baFSbVY7PnsHrhOzcvh0VRSjtRwgWpdRB4F8pjymlqvwIFnkAepxQsCycVANdJ84YxArcdp3G\ncpUjp8+y64bbmD1Z57uP/ID3/cJPUunU2Dy6jsu3
X8Zv/bff5pfe/4tct/V27v/277Fq1RiLs7OE\nYUzg+xi64s53vZF/euE5Jien+IuvfY2xoTJ2j4UIDf7wc3/KwkKNW2+/Ca9zmAE3hCp4WzSOHtbY\nmMkye/AgvbrO/mef5XUf+nke//7TCE1x0+23MzW7xLFnX2TY6OXM9Cx+pc7BpE0qJW5jgXLWYBSD\nNX5KLVYMlPppLs4hkSyJmELWxnRsCoUyW3fewtCmy9n/6Y+QMt6116aAGemkwkXFCbbUePLebzOs\nl6kYZTa/78O0dB0z1RhwBYtVi75mFWv1KGPrS7z4xDNd8mgagW5QKJZwMnmySifJGBCEK1CrjkoT\nIiU5+dh2rvjlZ+CT8LPf9Xji4T7aboCfxoQRhGFILpddoYgJ9DjANgzGijlyxERRQDNJCNOYTqVC\n5LvEcYTUjC6tXuuylnXdYN3adczOTBOEHT72X/8zA2Nj5B2HgtSpLyyQSLBtB8vJIlBYhsXA4DCG\nSImSBKnpSDPT9cHM5wHYcuE6Yi8AvUvsFKaJihRK0zD6BhjadBF3/cFmFmfmeOwfvo2+1KSUtRl7\n448x8eILuOP7XvXMvnYie0qQeiFlJ0eP7WBJwUBfEcsy2bf3AFpgceLFU1Qmljg7vcxTzzxHY2GW\nxx77Pnd/7W8Z6B/mG/d9g9Kg5PrLdvCm22/EKphMnV4gb9t86e4vcsFFWzCMmGwujy5Nbr3hJkaH\nBnnPu3+an33bz7F69Ua+8537ufmqFL4H3AKHTvRw5c4rWS8z5DoeZrNJf5zwg69/HcPt0KpX6cxW\nWD49yRVaLzNLC4xUWqz1U64UDjsSk8tyvWxrCxJ8HEsnDlr0rxqi7Uiigk1fYtCrW4i2R7tWIaou\nkPPrWGkHPeknCTOse9/vMfaR36IjFKGuI0RKD4JCtUkxN8LRMy0sLYfrCzqFEk5e8Mgzj+DnNTpZ\nRWSB0iSlUg8500EFAaHXot2qU63M05ibJapVSV0Xr93Am53nqb+2aV0PzED++Dxv2A1SKBzLxDRN\nSqUSzWaT0PNI4xgpNXqyIUUjIVEhiVQIkeJHAW7YIVIhURqCVNgZC0OCY2uUB/q5aOeVIBVrx0aw\ncwZ9Q/2kgUeqKyxTJ2caqMCnubjI/MwMM6cnCIKIdqzQ7CyaaXOORiCkXLHt0+i4blfAwsmilCKj\na0RRwNCadWBb6MODDF1+Ce/5nd/k4ltfz6SrOF6P+bmPfwK3Z+hVz+xrFix+EuHHIYHnoXyPkpMj\nn3UYGOzhtl2vZ356mQN7XmJ2cgInZzExMc3EsSk0Suzfd4JmPeXMVJNPffkbfPPeR/i93/wEOXMQ\nQ2bJmxZPPfNdHnviYb7yp3/BwEAvN9x8NZ/848/QarrceuU13HTdZdx08zauv+FyxoZOnp+vPPpE\nzMG9e7hkbDNxNSArcxRFgUFNQOQy1NNLc2Eed3EWe2aOmr/ISJSySVgMGhQ2KQAAIABJREFUBinL\nS3PMagnCzpA6WXrtAsLziBsN0kab1AuJVUwnSajrKQu2xvQLT3DmM78PnkfD1FC4ZAsDREtzJImJ\nrws8GVOTMdrW11GvThJXjnF07z/T8Kp0vAQqi4RnJykeOMnonnFu6h/C1HUCz0MkEXGzg5MKdNcn\nh05eN9DjrnKlCkOGM+D4Bi/+4xj8B+BL8P4POZRLeRzTwrZt4jhmcHAQx85gmRZKKvot+P0v/xmf\nffg73PiOtyB1jcT10KKYkXIfRcvCzJi4gYtjmdimzuLyMlahiC4EUwuzjG7bypG5M3gqQnMMnP4i\nViGLZupIBXk7y3D/APlCATub7VYiKzpN5+gzQghiT2EmNrXZBrWFOpWZs8wcPc7pA0eonjlLZ6GC\nUilGDFLovP3XPkz/RVvIRxoyDXh0OnrVM/uaBYsXKaJkhYCnNEi6Huydus+D33yIWqVGu9Uil7MI\nl5cQbZ2rbroK
TmNZ1AgALVNviwhcQiqGQn\nWngCLMYUFaZVKSg9ZbvD8sFdBNkCupeydeo4nrH5HMqOwhQGm5doNOHoGEtRRseUFBZ6JmcpS1lJ\nuxifIVSBMDna9qnJlHrZorPvUSLpaS1u45rrxuEFwJ1w6mklahj8Ukq8pCqElCXTUcyWiYAzto5w\n4QUzvO4t5/O291/Mr7zvQradsZFQ1gCJUgLvQctq9UdJSuHpWUs7z2mnGZ2ixCDJHBS+ogVX9Jin\nlrv8e23y9gEdqjHF0nt/zn/UKq9KHTxTkxMcf/zJzB48zKCfsuW4jSyvHGGk3kBrRaPeRAhBt9tB\nBwnfu/5Gwihg1dQEzjqWW21e9KIXsn/vHlatXsXBQ4/zjasTjj1xM2d+Yi+8BH7ttoPc9dhh7t4z\njdq5n2e85IU8fMt3CR/fx+qWx4WCPPJoE+O8Q2iPzwO6OmZqcgvl8kM4o1kKDUnhgZDQ5DihCDfN\nkNYSkjJASEk/cChvEU5RaksooedyglLgu47nvfhl3PPZRxm4LrGs0f3+bRwcLOGZZ+HIApc/t1fJ\nRC6DH96UkJUlVsDUqin6rS5eCGxRoKMQ+yRWWrXpdcMPiffgZCUd2bsn4PQTCtgFenwZvxwT6gQ5\nhH8k0pOFEWWWUqs18CXVGPPA0p89RD1apnnMCey5f5nnbX8xHXWQXjRLfXyaHTvuYP5Im5qL0Faj\nnCfLC7wxRHFEpAPGw4QECJTClCmmmGf+4CE6RcHhXSOwfQW+BKe9pURKgTUWISVCKpwzbJ6YZFxA\nlEM226GTeOyCZGpkE3GQ8vI3bGfhcIvPfexmysJUED4h0VSdey8l1ku80pTOY4wntQOEh1grpIBE\niUqL9hTHv3dl8cBzvPdneu/PGT73hFXe8cANw8/5Z1Z5LwD+QjyZJDE8jt20ge1nns5xmzdy+OA+\nanXNiScfy8hYjenpVdTjGiEhRZEyPz9LZgxf/uo1EIZc9LwLKdIWv/3ed7Jn1wOsWj3OgcOPs9Lr\nsf6YzRwzcwwN/yl6l54EvwLipXDm6Z/g2Gd6pIh4+MufxRYVxCDXqtKWOUNbCsKaIpSCseVFvvG+\n97L02G5yadAb1tAwktgD0mKDgEAEDO68hR/+zju4+XN/jjAlganm1LtrN2IHntxBKBKciomyFnf+\n3cfQSuFlUE0pdg+hbZfS14iiiNOf24dvVcFy040xzlZwDZ0ECOkq+Ll3eFtV8J44qg2/HYJvql6F\nNSWl9OzfGxxNxZqTK/hQkaiSwIeU3hKHmlB5UBLpLCOBpz6WEDdWky6mhJ0F9tx3PxOhI9uzl6ml\nMU5wW9GLfS6+6AVc9pormLUrLPsBaSlRMkZFDfo9y8J8l4Pzh+m3VxBFikeQ+oL20iEGPcvKysmw\nHbgbTju1ROKHchcIJUxGilEpCGWJl45YR5iup78k2HH/QR59ZJYf/OBGjiw9zKvesJ0glISRQQqL\n0p5AKgKhCaVGIyuYoJI4oXEyoOs8HQNzacni4P+dNuyfJ3QvobLIY/j4suG/j1rlee/3AU9Y5f3U\nsXHjpmGJ0LJu3SQz66ZptY5gTbVZNThy75mf6zE/36PXFbzspb/AJS98Cd1BwPiqLXzjm99n9dpp\nvnvtdWzavImlI3P0O0vMzu7n+hu/z3e+ejnd3xiDE0G/seSCl30evykn/fZ3WX1wH8K76gw46EQN\nXn3ll6hd/jLKMEHbjA1LPabKEhkHtFsLQ1cxRSRqaAtpkDPV67H58UOsHbRJjMVhSFcfy3G/+RG2\nf+ATBCJAiJJQeXTuqHkonMFOTzLQCuMNpXcYJ5jZXLBaGZiFwclw7x0RygPeo8KAKE6OmgXZ0hwt\nH/9z8g38JM0tjWHfnmGw7AQ53UZbT6mqkV7vPGCIAl2R+qWnFqtqyvOkrbgwIJ1rUy/nOeOSc7nv\n4C5KczfPuvjbXP78XRy35gZq8X
d40zvOJFzdZaU3oFUWpAYWyj678wUWvWVva5m0SAmEQ1lHiGLd\n9CSnH385/TVAC1Zrx/QqX9EzhaXmHWubDYS3lFJW1UkJWiQ0otUceGCee27ZQb/VI+12sOEh3vP7\nv8zaY0dRXiCtRTwBPhegVWXSpKUg1BIlPYEAKT0yUJTy/7A+/anj/8vKcr0Q4i4hxJuHzz2VVd7B\nJ732X7TKO/GUU/j8F77AYNCnSDM6KyuMj4xi84JOu09aWG667Xb27t1DFCaccdpp3HjjD5g9cIC8\n32XzpmnWrJEENkeXPfY+fD+dxTkeuOcO9u1+lCiUdFqea796BcUnQ5iD5A+6XPrZ/TSLrNrQIdAI\n0IrYelak5+znvIhOntMNBFaULCUOU2jivPJcN2HEshAsBzWSoEkeCvLYkUlPVyuywCJ1ybQo2PvQ\nD+n7rPJWwVIIS5c+SoesS9ZSFCV9Z8i8r0yQLuxVPpiXwp23J1irsVlROQ4oQX1iZDgGrJFaUhTl\nUdLN0T/osMz6xJiwc459ezWcAOyC2syAfGWZWk2hdISzgjAWjNUThIwxQLd9kLpfRizsxdOhGM9I\nI8NCP+fhIuVZb7uTbffuZdvXHuaiex/mjcVuXlS/mj/5k0d4yx/uYs1xPdoDw4Ful6VGydLMHKe/\nbj9rXnArfdMmsI5Bv4NWgtn9j/PYrjqcCdwNp55msdqDL4cuypYkCUjCClBROocxGaNNyaU/dzYX\nXnIGW04+jrVbtjK59iTGNqzijz/2P0iSFMWwMuctzpQIa4i0JAkUgYBAV4NikZLUdESk/9+Ujp/t\nvT8TuAz4dSHEBU/+4tCD5al0M//H1977nvcRx02u+srV7N73OAsLSzz6yG5S41nqdNizdz/9fo8L\nL34OP/eyF/PAw/dy0rZNXHv11ayZbDIxorBFm97yMto71q4aIRAlY8069VpIFFWnquhv4Pabfxn3\nFQH/CKPfW+DkDy1TElB4xe5YMioSEldy1Wuv4Au/9jpiN6B0HGX2J6UmTKsusUVwwXvezks/8Rd0\nJ1ZT9zFGKLAB0kiSQlI7fIgfvev1tL7+JSJrCJSk1xjluNf/BiqUpELAzn2ouALllSbHK8Ppz8uP\n7ldu/kFMKH1FfjcOTSXzQMvqbmkcYaABhxfVx1GM2hNNu2Gw7H/SypLrA3TwLByeJXI9lM/IrEeK\nDO8HPH5wD1meMntgB4uP3o/PV5ChoCY0xeEFNq07gck1Ar4ITAA94Hrgg8CJcPrnl/jt37qXX/vg\nA7zk1St8/ONHuOojR/jFlSUuay1z1i99n3Z3iaDo0jp4mI1rjmFxYcPRVGzbaTnSeeo6IFYOUxaU\nxoI1NGsR8UiEr3see+RuHtt3JwOzxHKny6G5FnnRQAermFs5wJve8ja89kSBJNSCIFBDFYDFGoPy\ngkgG5GnBcqvDcmuFXv+pO/j/rg2+9352+LgghPgqVVr1H7LK+5XX/jxpnjM7N0vcrFFvjLBzx6Pc\ndccddHsDtp9+FhunVrPUWmbHzgdpjGjWNWfY+LpXMX/kcbRWtNoVCURIgXMlzWYDay2DQYq1njAI\nCMOA2QObuH/sBZzxN9+Gt8DYd7okNqW3eZqf+5OPcPcb3km3O8cqmyC9xsrhrD6OZilIVYmXCict\n9bIaGjoSadxEk+DQLFqFeJURiAEFTVxZkviicuzCMrAJJ7/stehzX0b7i19gqmyxUpeoQuHXTzGY\nfZzJGE45JztaMr71UzGlK5msadpFl4nV68EljM9M89iPdxAN53dKwLuhRaDzIJ48EFdVyfbu0XAc\nsBfWrrP8w90/4MyZbUyPKLTNWVg6iNcBVoeEgaNbFHg8S+15jswdYsNZ28hshvMlW2ZmaEbDqPxF\n+P6RdYzXBeOTAzY0VhB/CpwGZ71mmbPOp/KNu5kqaf86nLVzwLdqh7BLktBr0vYY/c4WOOsR+Cc4
\n8U0lmoBIVWVw7as5/aAmqI8q1GhEHnlKA+3FDkW/JGk2EFJxJN7Lls2bqY8ucfkvvJUvfOnzmE5B\nXlikV8OUU1XnyFZ+pGMjI6hAoSOJQrO00v6/DxYhRA1Q3vuuEKIOXAr8z+qt/99b5XUGKbkpmdm0\niebICLfddjsXXfZ8vvi3n2PLlpPZtGEVM2vWsmrjMRgd8PBdd3Lqtq0c2LeLuSOP0+1kjI1OEsaC\nohigdUiel0RRTLM5VtlRCIG1EOqYfjeBNnAStHZHaK2ws8uk3Yz66BiyG1IiMAqcKymtZK4e4nKw\nykAgiEUAEm7/+KeIknGiwRy5NMRWU0pB30MiJda7yuLBOUDRcB0e/v7XmcEQm5zYCla2HIOfKzlu\n+7HMf/sIp53XR9/lYWtVMp47FBDFkhNO2EDPWRyKtJ8hvCeqxxTdDIxFBgpjHUqrnzJBfQJACDAY\nCPbNajadbgj+BD70hRZv/oWHMG3Liy84j7iEfpZjsxW2bt5Ez6QMBo6DRxbQWUlDeKJQk9mSsbpk\nMMQTAdx+/Qn058cQwrFmZsAFVzzCtnc9jvgI8LfAq8H+PfRcyGhcwF/CM15+gK9/LGa1VvT2KWZO\nHxluvkDIAIMgTGrg0mp1FCUqUgR1QTQqK5JtLlmnJslMielneC3pt+boLN3P9ESTVdOjOFMVRKyr\nmrlhEKO0BCzKR+SmZJBmhEoRyqGj21Mc/56VZTXw1WFerIHPe++vFULcxX/AKu+RHTspJMwvLtLr\ndvjN33wfY1OTXHTZxWzadCxL80sstFusCjbS6xyhyFpc/+1v0u138R7GJxr0+y2mm+sAh1IVuqfI\nC9JBRpIkFRTBe0pj2LRtb3XX3g61xkXMbdasenQPt//6r7MqtVgCHBVD1zuBP+eZvOiNr+Uf3/s/\nseYQVtawVlJoy0g5IEn7ZD5jEGqUy4nPv5STX/RSHvjUJ3B7Hq3epAAvPamUNHbvZ/nwJ4mtpJ1Y\nXC45713v4ZaP/xYTzjO+YYg9OgUe2RmiBAgpyIolpA8Y06NkuaPjc0469STuvvNuAjTCC5ynciDQ\n6qcsOLyv0EbWWj7yh+N89J8WUJfAiZR85soV3vJLq/nKTXcxMzlJq7fEZaechCwKXFlQSk230+OE\nmQlqUlOoOjrPqSddkloA5NVb9BZpS5ABi4ebfO2vz+HWmeN51iseZfTNfR67bx0PfegYJtf1eM07\nfgDPgefu6vPdr0c8tHeO0cxxrizgMLAOktpxvPyKS3jsntvoze1DBQ4ResK6QicG3XBI7dFa4gpB\nvdkgLwqMB5uWmO4hZqbPw5qYxkhEZ5AhlUdphXMl3oCUIFCEUUh/0CWuJ2jh+LcE+P9msHjv9wJn\n/AvPLwOX/Cuv+RDwoaf6f2UQc8fNtzOxaooLLjiPhx98kPvu/zHHHruZNZNrKPOUqdVT3HHzjezf\nt48iy/FAPaiBqCbcVBTQ68+T1EYoCks6yDDG0m736PVzwjBEKUVWOCZWz8HdwHvgvquXOfFN72LX\n+9/PWFYBELzK8U4TmAFaxyy2VwhVQBqCKErUqVuRU2vh+uvIQw3OVoZC3lLqGsed+2zmm6OMbd7C\n7IH91HJPNwRkSjNXFJFAZg4vDaWt01xcoPONq5joOOZdydxDCVwO/CVs++9VGjRSjxgZmUZ5jyks\nVpdsGV/DwX6HTZtmOHhgFmMDFAFa5hSlrWiN3uG1HMpeqhvJDd+N+e3aOB+6fmUYMIZP//0RXv+a\nCXavLGMCx32HFqmvDxkM2tzdyjjcanHathKXrieoge96Zk7dS6P5E41aLdbkNY8znsJItPIcOjzC\nNZ87Des9cXOMPBuwuGOCw5ePsm57m/BKx+9+4sc8ev80ux4MqTfnYBZYB62VJgjLxmM38uDsQXIM\ntUQhopKoWcNKgS1LQhVgwwIvC0TgCYQiMAHWzzExenLlP+NB
1SzT9Yj1G6cYm6wxPjHK1q0n8s1v\nPMztt9zHmpk6pjuKCVrIfyMcnja5y9333kevnxIkfa768td4+SteRj/NGZ+Y5JZbbuWcc87hmmu+\njRn0CMKQOImrtEYqwjDElqrC5siA/Y8fRqkApSrKYVyvsbzcotudQ2uNdQXTa+fhHmA7dD+c8siV\nH2TMVENOTkLiQpaiBjYYEDjHyP79/OOvvZUxK4jCJqe/4j9TjI6w96FdRIcWKYOcTIBFoso+t/3Z\nn6In1hIs7MUEPZwRhCLG+ia90BBah5EOjWSAQ5U95u7/EdI7JscneOjHS5TbINgLmyYM46sMxhYs\nLRypuvbGsm7NaqQyjOmApDbG8vwc3gb0B0VVCg00uflJufSo+9YwNfv6V6spxA9f30JdAic4w7dv\nmOf66xK++Z0G99wwx6HZIzhtaas6A205NOVoRrtY2zzEua9+jOM3zMJ7gAVgGg4s7qa1MkLIKKNB\nE4xCxpocixGShVabTn+Zbt/wxX8c4d3/rQ0vgdGPDTj70n2c/bx9cC4VG+1UWJztIMwSq9ZMoEfr\nZL1ljIDMC9p5n6BW2ZSLJ+zUpQDr8dLhRBeJohkehzU57/yvr+RLX/0iKnCYcomF9jJL7UWCsMZf\nfPJvOHRwkfe+8420TBtrBBOTE095zT5twbL/4GFO2raNpNHgPz37Cqxz7Hjku+RpyaYN6/nSVV8e\n6p0ERVGBBSq/lpKyP6CfZ9QadXqdFcKoQZrmmDRlYWEe51ylXBWSLC8Yn1gkPmyhCZ0oov9Yn03G\n0RkSMJ3wLIw3uPijH+Huv/l78u99j1JmTArJ4VhhEFz/O+9jevt25lu7WKdDnAgQOJwQuACarouY\nH+BUSZDX8IS4c5/NST//ZnZ9+o8IH36Qri5JCo0vLZkfkKgAtCYxcMRKdu6MOeUZGfwIztxesjS/\nlVNPOAYloR4lrKysYJxlsNAhieuce+7pfO+G2wl1Y8jasgSyQncYPM49OS2zKC351jVNvPf8wfVt\n1Mth9DPwilemvOKNKa0/hutuSPjOd2vMPZpVPQ3Z4oyXzXLGyQP0nwGfBN4A7iG46d4xygKc6NFd\nSlk4uIetq2bY7xexNUcwWUdFCR2h2LMwz4//IWfd5oSX7k2pPQJcC/wRwwQeeD2UDwjkYIl9B/ex\nau04jz0yS2o1ohQIQpz1WO+wpatsPLSi189ROqRed2w6ZhPOKj79qXey6/HrICgJGnXGag3aKwaT\nhzz2+C6IdrN58zau/Kdv8dKfuxjbMizMLzzlNfu0Bcvmzcex6djj+MEPbmLXrj0cOrifN7359cwf\nPsjBg4fp9QbUajWEE3jvKItKZJkPUtIsY2LVFK12h7yfY0yXPDeUZU4QKJCCNK/GTQvjOPaEfpWC\nbYfHd4+BCekHBS61EFQkkXpf0M4Ctp33bO649VYK59ClZeANIZLJWOLu2MF4EhO4Grku8FWXBm/l\n0B7WokqNDTLwOcefdCIDp2nFNdY4KEVJzQUELmMgNNrHGCcIuxlJPeb+e2NOeXYGt8Bp23Nuulai\ntacoLe12i6mRUZbSAVGUMNoYBee58DnP5LYf3oezIJQj1iGFs5giP7qiVPsWT71eJ8syvnF1naKQ\nvPfLbdb3HHwJ+FUYa8EVr0y54nXpT7pmUJWGX0FV2rkHdg3qfO8fxpjdG9Dwgr5xyCxlLLdMFp7l\nUUltSx0/WYBsMaYjRjckONfgpgc3cftjivXTbU44f5mTf6nD5tEB8i4YnKXY+bVR5CBnVEmSmuaM\ns07j4JG9lC6jl3uK1IFzhKbSx/VbfWQQYsqS0lhWrRrn7b/2KpKpOerRWpRSzC5k2EmPEBqHQ+mU\na777+7zqhZ9F2HE+9ekr+dU3vIKefeoO/tM2VvzG1/wi27adzvzSChMT40QBPH7gMXqdNuEw5XLO\n0WiMHL07SqEYDFKyvCAK
QwaDPlJIzFC6jqgmK40x1Ot1lBIMen1e87oHuPjGx2AErj/5BB7+L4oJ\nHEvG8+K3v42rP/ZJGq6gkCHgoXQYbQiygAcTSyIcawoF0lHzilAYBkpy4e98CJNIvvG7/4NmaQid\nJFOeQARob+mHIcujE0wsrSBsjhEdwmPPpnnGWvZcdT9n/tUH2fHWt6JdyGyQsuryNh94yRx8GB78\naMhff+z5HHtMRBhFdFZaJLUGY9PrWb9hPaOjo8xsmiEMY3733b/Hjgf3Y6xDGI8Rkp6zFMPu/xM+\nN9ZVe40nhsiE92w7LefSywa84MUZ69q2CpxrgO6T/mBbgd+D2amAG78xwe6HIxCCIocsLem2NeX8\ngHX1E5HJEv2NA0bGFKNjIS6M6PWzIQ9MYlXdG69AAAAgAElEQVSAqI/gozHK3KGdptEYo3fg+6jZ\n43hsd59OnGCdQGcpvZUWJuwRNUoaY5pIg1YS7wzCKHJT4IHCVnT98bokSSAZaVKWJQttS6evMJnF\nFA4h4ZjjGjSbjjwNmJzaxMYNZ3H3D+9g/0MrXHvzXf/qWPHTtrJMTa7noQceZWbDDDt3PkAoJcZU\n/hte5Eip0FpVnhzD6TlrLVIElV9HUZAkCQC+KCpFrvgJobLf7xOFIYHWrN+4fHRz/+MH6rTWhtT3\nLVITEYe3HM+mN/08fOIfWFGGUnicgdiVdLRFO41TAi8EgdDgDU4ISkLa9YCwMUKuYjRdMpFSzyul\ndyFLktxw7AFLu9YnIqYbreasX3kXbV3gv3kv4tB+IKCjAnLh2HVrCB8G7oITtxYce/xqLr7kOaxZ\nt56xyQmK0hAlSeVtg6Pf6xCHEX/813/FR/7gz7j+698GW40MJ0kIeYF1DuMcQvhKWf1k703v2fFg\njYfuj/noHztOOa3g0ssyzv1fGRPjEVpXqWavHXDrtQ32PKywpsAMqrEKm3tMryQsJRbNzsV9HHda\nyOSYpjEeUR+JSAvDxESdsvRYIfFC0xcWIUpqtREGeYfMKJbNcaws9ch9jCur94hwiFpCnqbYzGLa\njjgWxEGlpiiyEqkD+oMUoRNsKakFHq0dPi8pXJUmZ2VJZ9HjixDrc2pNR7czIKmN0tk3y9yRO1g3\nM8m+nfP/wpX6k+NpW1le+rzLWbt2DUEIxvUqC2c81lX+IEEQUpYFOviJfMNaC15ijEVK+VMOxUVR\nVCmYqCzzvIdQV6rVP//M1cQzFnbCjx6Z4Z7vT7P8JctYCwpfJ5gKGZubpcwMRnpyX9JKRlh97nbu\nvuWHhD7nGFdDSEikJ7aG0BsWG6NkgWKk6/G+wGlH1yRYGTGqDRQ9MqXRRR8jI/IwQJ39EqYamn03\nXU2nVKwWA+ypz6QxNs59t13LB2/YzeYrSvgULG35M8bHn38U4O09dPMBcZwM51Ic3lisC7BW8rb/\n/BsceHQvohSk3qBViFOC5UGXyglvCFMXFYD9iSIAgJYQRRFKKV71qitwxQCNxBkDNsPmK8wd3kOW\nLZOImDIr8c5VJdtCspI7Bt5xxgWKVdMwMd1AR4rSSsqyKsyU1pOVnpYR2KCJsCGFz3C2QZm18WmN\nB37ocE4OrRIt1jpK0yIdLBOEENckoTaEga8c3KygdApkwNTEWmy6j9FxRzISEgjJoIReJsjSEukV\nQnlUoECESJlgvKpgj96zedXJfPj3P/uzt7Icf/yxHDq0j9HxOrjyJ4204WxGnmdH+yZHZ7ClJMvy\nqrNLUI3aGovW+igrt1LiSryzFGWJECXaVJt7LoRn/dwhnvXiQ+TvlDx6a5252xXZnQ0Ot0JcUQWf\nCSTHXHApU5c+k9vuuJt6WYkdE6cogKI+QmBgui9pBTlWByinKIXm9Le9HbVpHbf83gdZ2++jhcfL\nkCw0NG1OfscNHBEpbfoEhPSVxdc18wuzSCe4/66YzeeXcAuMnLWbHI+SQ0SPFyRBnSIrSZ
KEIu2C\ns3T7HUbiOn/x2b/k5ZdehjEFEVAUOU7BSBLTTzNsZdRZBY53PFkbK4TgzDPO5LTTTsNaEFGI8Bk6\nEDhbwdOnZ+osLc6xNDtPPhhGnLUUtlJvT0ytohbPEscpSuZoIsI4xieOtCjRvmqcRs5TWENR5sOV\nsIe1faJmQqELVBEjpMF4j1EKrUZo6Igi79NpLxMnJeOjNZK6xliFN5qRkVWMjaxnMT2ECkqCwDCR\nNBlTIYWVdHsZHkfpA4pSY8oAK0KEFRhnyAeOPenep7xmn7ZgWV5cBm85fOhxRhpNVKCJ6rUKVOAd\n1prhNGUFH3hiFl8HiizL8EMvxCiskef5T600YRgiUBRlinOez191Cq98aAf1nbbKx98B0QHHKc/r\ncsozgfe14AxYmdcc+XHEkfsC9j3wHQYPe5oWfBmiFVhhGTvvXLa87IXc97mrWNrxCMpZhJfgSzIR\nML5mE4OySd9BqgvwCVoKEhdC7rBxB2cddRVSCkdaBqR33EZLgiKn1doG598GV4J+930YP8B7gXSg\nZYCXCXEoydMcIQKipIJO9Dor3HT9tcxs3sTuhx7Fe4kRFm88sQpJmmMsdlM8Fmuqjf9RAaYUvO4N\nb0SKAGQ1tWqdRRNA3ifQkrJWDVKtW7+FVdMbyYcMskG7i40jwloNpSxR0saHKaXPCZxDlB4Rh2gJ\nxoEQBi01eVY5IDgjsaKPkJDlHcLAY/oJVmYI3STwDmSASAKmJiRr18Y0xzK89VBIotooG084k8Nz\nbRYOZkSxxBYZRSpoywilLEk9YVIleAEuqNEZOFqLGUXhaDQnWVycRyHIy6fmhj1tweJdSRSGjI+v\nR1iLwdNaWj5K9piamkIpRRzFDAYDlFLDgPFEUXL0uSeOsiyPKm7zPK+UulikFFzz1dV86+ur2HZK\nm2c+q8XpvzrHGjeoSpd3AV8AHoTxTYbx7YYTt8OF72zBmY/w3Ldq9t0bcc+nmyzepdh9171su+gS\noolxysJhY08pHZFVNL3g6vf/Dk57Vrk+XoQYEbHsHK7eoGkzhC8JhCS2nkIICpehvEKXBSqAe64T\nvO5jVCgjfy/aKQIZVuTENGPHj3/AXXfdy9LCEvPzC3jriYOAQb9PEQSU3rF+y/Hs3rWvKidLSWlK\nSucYadToD3pYX4AYuq15j0KydfMJLC91ybMKnD1wbXQYUNoaQoG0OZoaNi8IE4kKq7S4OTIOukYu\nLN6meJfgbYAzFoNFBAZhJFpWl5pwDmkdysmKI2BTpHC4wuF9QJRYbNuiVYT1EAYGKTyr1xSsWV8i\nVEpuUoJag16/oLewQrt7K42RkA3HriOUY8wfGWCyhPlly/SqCCFyalEdpQKkDpBJQBaCKT3ddhsp\nBZnxZN2fUcjem694FUI84WoryJ1BhkHFwnLVStJutxn0858KHqnAmMqMM0kSvJMYY4iiiLIsaTab\ntNvtKkWzBcbkBEGAtZWzlxaSUCpWr+uxaethjj2+x3Fbu2yY6aN2UhUC7ho+PghsAi6C3n8TfO78\nGZLeKHkhCCXkaoARgpAaVpQIb8ltgJISrwyZtwgZc/K73o8cq3P3Jz+GmN2JNBUveV5Dx5Yo5RmQ\nUYgQFQR86q59TD7DwrVw1Y9eyZ5HFfUwrrCtS0t0OznZoMRbz+jEOIU1pL0+gywjzwvm5w4xMTrJ\nodkOIIY9p2psNYoiEJ52L8UMXQy891zx0pdzwfkXIrxkeaVDu9+p5macxbgS7w2uNPiygkl4ByiJ\n1tVNzCiDNwb8I9RGHyROPJE2BEFEEGmM01ivSFNHbyDpppJunmJdgZCSXmrIBhE7d1rCYj06tDhC\nIjFg7cwKx26WFCan8BGprXFwto1dLigygWfAtjNmoBawcjhlZGKaB3fuIAg09VgwlgQ0mwG1RoSS\nMUWqWOl7lldyTKmw2tPpG3w2xg3XXPezt2fRsprmQy
hyX5kX+dJUJ384Cz46MkKtVpU7s3RQ8cLw\njI6OEsch1tvKnsCVlJlBByGDIqfwllCHWO/RYYgtLc2kwVJ7iVAHWGHZs1ux59G1XH+NJYxDRCCQ\nwaOcui3nrBcLjnt3n3UzffRO4O3QuNJzwe91ue6/BNV4rB/aYtfGEGuOobu0SKPfQeAw0iC9RziJ\nkQXZcsp46Oh0lxm1Fqcs2laQvUIKjIfSB5QCIlOy8+4a553fhR9Cc2IfRb6FXruNcJ7SKOq1UQaD\nJVKT0T20j1o95nkXX8izL7iQ5SMrXH/9jTz84CP0UiokLpXuKVAaW5SEYcD0+Ci9QUY/TUForvn2\nd7j5h7fy6le+ko0bZ5ia3szywhLtdpdQKjLbwwuFiuKjujMAqTTS99DCUQqDZw15cRjPEXRT4ooA\nJxx4hXE51ni8U+RZgRv6zxjv8LlC4Aio4VWJR6IVNBLH1i2WKEnxeVZBDhFs2bqRB2/cTadt8Fqw\n0s5oZhoVCoJmwpaTzubh++7EFhKblWRFwIiBQDjKQURrJaOXVuPXSE8tqXHauS/ihmuu+9ev2f//\nw+JfPoSAWi2p3GZ9Nf0sZWVBgBTkWT6EF1R3vrHxUaSUQy5uTqvVwXtPkjSIknCYhlWcL6UkeVkM\n4XOVfL3b7xOEIcZaojAEJ5BUq9JgkNEdZJR2gn5rlNtvMYRaMjYecf45R3j1Zx6CZ8MZ97Z55FlN\njtyk8KFG25LcGLa/9fVkSvC9//4BJtopReCRTqKdAAcP/u1fYlXBeJGhbERXOQpvaQnJnDT0naUW\nRoQefCD58V0x5z27C7fA1JseZ2VpgiRJmFmzlsIpup0uI6OWpBzFWUe7dYTvfusGvvSVb5KlDmNB\nyQihFUEUV2ghOwTxAWVRYMqSRr1OGIT0sxJTGtI05XP/628YbTR43gtexOmnnc7UZJ10kLK47MnL\nEuMV3hrw8ii8W4gQrEUGEuFriHwjTpTkgy6BhsLmQw9RKF2A9Z4wivFe0Or2K4QSI7QWGxWjTUmS\nWp1mM2DzRk+tcQQpFKvqUwxKgcwdvXyZ488eJ+trhI5IGoLO4YzSF8zunaMxso64NobNc7KiRCqJ\n8wWRgEEvo93y9AoPOiQeVYyNn85A/Iza5L3jl18LDMkkQ2l5mqYYa8nKglqtRr/fRytxFLr2xH6m\ncrIV5EUBErI0I8tzSmNoNCvARRgllFmGc46iKJBaUZghtb+sKmhaVD9/cXERpGBsYpxYBZiyQAgI\nw4B6GPGr77yT06+dh/tg8Q9D/vE5GymMQ2ERss7S1BrOOO9sHv7m1cRliUQcRYwa7wmEJ8VjpCR3\nnr60LEvBEek5YAcUShBYCKzAYTnlTMdX/mgeXgmdOxv83SffwK4dj/D4/gP0Cz0sk2uEDKryrxRo\n78m8QwnJEE6J9/Zo2T3r91HD6peUErzDA1GSgNSkaVndmBB4BCqQxHHECy6+hDPPOJM4qrHUarG8\n0iMrq+lNay0/oSsYvDR4W6BdhjH7UeECzi8gYg8iQ9gIKzSDwpLmIb1OQVpICp+T9mt0W9MoBUom\nSAlapZxy8hKrpg5TiyZQQUJRwKCAQWmGr9N0uoooHsGaSWyRs3SkRW/gCMYS9u95jEYYEYWaMBQo\n5ygGhqKoMbAeWWsQjoyz5eTL8UGdv3jPq3/20jApJWVZMBjkqDCsGo5SVoGgFGmaEkURtTAYlpJz\ntKgqMtiK2i68rwatwoAkDCitxSHo9/t0O20Cr4lrNcIoYpDnhHFEv9dDC0WRDoiimM7yCiMjo0Sx\nRlAR3IULkNLispK+9Vz1+TM44QPXEZ/nmXqo4Kx3L3P7H41jgLj0TM4f4eC3v85IWSCtpKgpcmso\nlEaXkCMqHI/09ITDSc2cKNhvcpyK0FbgBBRKYETJ/Ts1/U1QX4GR5R737PoGjzykccahwwAd1smd\nRXo3hFBWQamReO
dQApy35G7IUxaKkdFxikGf4ijVsoLLlVmODD2NWFEaT1o6UmNIRELeGnD1N6/j\n6qu/xYte+ALOOvscNmwcp5emLK2skKZp1QQWCuEMqBJBgFAC7U4iy8bp92JCkyLDBYSsmoSFE9VY\ntNdYIOtPYdIxkoZEuQCtE6RyOJ9RqwXUaiGxhiC21JOIJA0Zc56+SSgjTYxgfqGgrPVIrcE2AwgN\nziuQMaVVkFXN7Ir8GRMENaLQUuhJNm+5mCRuYJ7kTvAvHU9bsAzyDIEiShqUeYqWlUGm9x4toVlP\nKIsCaytXqDgOSdMUKTVCSIw1mDInCKpSsfcCKRTGWkZqdbSQZM7QH/RIsz5FWTAy2iQYdveaIyP0\nWm20kKyeWkW3u4xSgjTtE+iYwnmMdZT5gMHeGv/01RN4zWcegVfDM+9d4r6bJlDt05h74CHGvUeU\nIIXHaEfPh7zgIx+nVxi+8f73osuM0ngGwtPVmt15RjvylFoivMXIJ4ATlUVcVliuvaHG5a8awN/D\nJZe02PnAGsIQvBA4Y6sigrFVzm8KtNYY6RGigheiFFqBesK+WwjGonG6nS5ZkSGQWFF1WmxRUChN\nGIZo5YmNIi0qiUqv30dKxde++S2u+urXuOiii7jkoos4bsM6Bv0+S8vL9HKD1VVzzyuPkxUkfLTe\nwFGn092Dtp4wSCl8ZcOd5oJBUafbGyWOxmgkIUr6yjvUC6S05ANDEDgivUQkJRqP0CmiDs5LolLS\nTx29xNMYh3ZhkURIbXGDlCioUYu3Muj1cAoin1XWeC4ho0mjMcHMmmPRYQPnqv3YUx1PW7CAq3oo\nxoLzqDhAejDeEKihz6DSZNngaFOysnk2R6mVWmmEsBSFQQqFFArhLVFQ7WGKLEdrxcT4RLXxdtX3\npmnG8uISjThh9fQ0rZUVhLQEUUwQVsNSxnms9wRhiBCCK68c56yPjnLiK9qod8OLP7SPDzyvYFR4\nRBigvEQZQS48RRBzYNdeHnxsF1luCJSjl0Q87kpmBylZEpDh8F4iPVUxQAiccENrc89Xvtzg8ncM\n4NVw2e0t/urj6ygLy7ACi7ce4Ssv+Qq2V51PpSAIgiFA/CfKh9I4olpA3KzhBlCkWYUFompIWmtJ\n06ocHwQBgQ5o9zOyNEOFlUIgDEN+cNP3ufGG6znz9DO49NJL2TAzgxWKg7OHGeQ5Ell1x5XHupzG\n6FpGRyfIixXa7cMszB9mudPCyZjxyS3Um3XCMEI6T5kboriGdFTq8kjjTadyr9YGIRyu1HhfIoMQ\nHSmMlSQxDPKSwFkGLoVS4SjJrCVsHEM01kA4SWELnPMELiJRqwhMn8RHFIs94g2r8P5ndJ5F6xBT\nVmViFWjyQTrE94AZiiifKGs658jz/GiXP88L4rgicXig3xsQhglZ1iMMA9I0HV4khrIsiEebCCUp\nh2Y1zhkmJidoREnlZ2gdznuy/gDpNcZ5cJY8z0mS6nuCsMaf/vnxfPzDdxGd49mwu+Sy9/a4/o8m\nKJ0hKiURssq1u4vc+ZmPUChBQxl2Ws+86bOkPIO6rtynXPXbO+ErhsA/s1y45eaI2T+WrA0dEzsc\n289e4dZbRnDGEGiNlGooganS1ypAqpuIUmr4u8c/scyWljTLUEoR1WvUopjW8gpQBUsgBErJ6oyW\nJU57RhoJWmm62YDM5lQB6YjDgId3PMD9D9zD6tVr+NVfeStrpyYJ45j5pUVWem2kDCm8RyqJNxFR\nXGcyWkNz/JTqpmUMpW9gsVhbIp1HNgTSC5yzBHgyLUlTiSDGe4t3Q+yvKXDG4L3GeYlWIUpapDco\nryhLiUaSW0tRhsA4QRgiw5AoCpgMIm79/jW0Fh4jUobpdRu4cP3bUeJnNFicrwR9zjocDj+Ukhtv\nwFd7mmy4QX+yzYAQCqQfVr+gLAvqjVECHYMQWFMepa9HUURRKJI4prQFSmo6aQ+l
NGEQ8r/bO/dg\n27KqvP/GnHOttR/ncV/ndt++TdONQKCB5qUgKoIiLYLVAoqvpIpQmpSSEk2qjKKxYsVUtBBjUvGV\nSLQaIiooD01EBQVpVFqFbmho+qnd9O37vvc8995rrfkY+WOutc+5t7svtwj0MVVnVJ06++x9ztlz\n773GnGN84xvfqNsG24V0MSS0C1mlYzI75+agwqhYon54P+9895N4w2/cD98DN912jpvfV3L6PsuB\nIjI2BYKQdJlIw4l6wvkSkg7whSHisZqQZLABogVveFQdxJSE9753gTe+fgNuhle9boOPfniBonII\ngkgmbMaO32WMRTXhnJuPa+hZD95nWVRTOGKHEFprWDl8mPWNdeq6prL5dEI1Ey29ok4ZDUuKwSJb\nky2atu42MIuIpSwLzp47y3/5z2/lwKHDvPq1r+aKK6/k4MGDbM1qzm9uMmtajGmIYrDVAuMqK/23\nKZLciBgTbVMT6ibvfMFjxNJON6nsCD9ZJsVVAhM0KMa0JIkYJ6SYiMFCEIhNbgu3JVYskgRmIF6J\ntCQ85ahieTDmrls/Sjh/jEXbYN2YQ4efiaQcyl/Kds1ZfMq7uTih9S0hCEEsgkXV4+sG5xzW9kLh\ngrGWEBSViI8NYDHFmPWtLRaXwJUFdEXNGGMe7Fk4Jr5BCWx1mmRXX3UUUnbQJobcKy85/AKPWEsI\nMfPNyI81W1MYDnjP71/Fy//nCa6+YYr73/Csr5nxu/cts2YlD9MR8NRYHGYwyBuBJiSBaEEyoBJJ\nRb42jCrMBSZ2WFLe/a4Rb3znBjwLXvzWTfYfCswmuVVau8a4yhqKqiJ6T6u5MFqWZab/qOBDzLmE\n5H4bRUAsXgSvSrm0jJYVs60tnAgDY7EpktTQhkAKAVcWHByPmTrLrG0xVghENAlGla04ZXr6QX71\nbb/E0uIS3/7a7+Kaa67lSVdcST1rODddo40GayqmWpMsWW0lCgYobEk1iqABbRo0zSjHlhgC6q7k\n3LnzLC80FEXEupQ5aSFBKkg+kUKDI+BMhciE0gwJoSL5JaId5bEUzlJS8MAdt3Py1CcwRrGSof+n\nP/9FJDfAmX+kIyfquqZtG5p6gjWOYTXAWZdzjW5Q5qzxiC2xRW6SClHxSUlqwBao5ppBWZbUszwQ\npwk+V7Tbhs3JOrVvqeuG1fVNVs+e59prryWlvAPHGHP12Yd8IqRMLtzJxoWOSiOCj4H9+x2Hl6Zw\nK/By+PhfD2k0MRMl2IJgC4x1qEBMoRsylBNpMYKRjOIBjxDI60273f2++wo+daaEr4TyA3DjKzbz\nEFRNeTyECG3K4VWgI0gCqevviTFui2ynPNbbx8wU9lEzUVEVnCNaRwCiMURr8SZPYkOV0Hp80zAo\nS5aXFqhc1jTOvLhMh1cfUB/ZWt/gbW/7H7zlLT/HZz5zB2LgqpXDHDl0AENg32CJIpbEGmKjCBZj\nLZCVeKJmhZekA4qqYrAwRrieyXSF6SxSN4a2LYhpwNQ31MHTxEgyWUvN4NAkTBsIuh/siIEbsm/x\nKk6f/HvuveuDDFpFosdTcejodZhqAZJFdrDYH812sShpKUvDi77263j2c2+gKkf891/9NTAlW9M8\nowOB2mdGcgRaHzC2JKUAVqiqAaFj4NZ1jQ8tGKEsK2atR8mJ/9Zsxtak5tDKCrPZjLZuiM6ByFyJ\nXlNiMBhkloAqReFIKQ9MEhHatgW1POs5Jyj/DHg2fHaj5HP32m4klWC6elEy20J35CI1mhQ1WXXF\n7QiRgAtOlZ0hp6ry7nePefbrW7gZXvmWTd71roM4sgK+oFgRoiZEMzXFdeBISgnryu0+fCCIgAoa\nFN8VIX1XvC0KS2kMhTHEkLXYNEWsglFIEvP91mb94CrD/d43GOOIMaHELPJPZG3tHDe//Tex1vGd\nr30tz3j2c7jywH6i5ovu1PQ8RhwxGVIUrGaH
1o4866rFPEXBBQYsM9ucMq1hOI5Ug0ibJsQEvoXo\nLa2PhGCpw5CN9iDJHcS5KyjKMcNqgbWH7+XeT32YpdLigqKitMWI53zdNzKLCWsm3H/HPZe8Znft\nZBGB1s944QufyzOf9WSe/ozreM13vJIYalJIWXElgqrQNCH34cdE1EhE8T6ysbGJbzwihtb7XEOQ\n3NtSFQVFVTFrWjY2tzh44FA3yFWxzmWUqx85181p9N7PSZg9g7kfiRFiJITIy15+Hn4L+Gfwnt8f\n5ddCrtSnEDshN6E/oRRA8ofTC3P2DvFoBeGdjoII73vvkPZVwMfhGQdnXP8Miw8esYYkQhDFlI6Y\nj8Nuk5H5bh1CJCXNY+Nioqk9k+kMRFlYGDEejxkMhgwWFjBlQRDBVAXRQqJLjDRhjXQ3I75tiKFF\nRKmqIk/jspbQNETfnTIaaX1N3Ux53/t/n5/6yTfzsY9+hFhPObA45ElPvIblxQorpnv/BKMWawqc\nKzFSYGQIMkRdyXD5Kzi48iK2Nld4+GRi2pTMGksTCupQ0sYBM11ifbpEHY9iiutYWByzvDym9at8\n5u/+iEXTYtThRYnWsHjgKAcOX4exI86ePMZnb/vwJa/Z3WMdm0jpCj74gQ/w1Ke8kZgiX/3VL+Su\n2+7gU3f8A7NGwcQ5KbJpsrSRbybzbsiyLClMxZnzq0TJIijtZEJhHVYT58+3TH3L/sWDDFxBPa2J\nMVJVFdOmxbmcyFtrsV1X5vLycp5t2EOuMeQ6TkwsLU94znUb8CFIvw7vvzF3aiaNWUJVFWssGrIT\n9m4TJJMPjeQiYdpBYHyEw5htbXwBzq0mPvRXFa98bQO/BTd+8xnuv2cFiQFjLYUraOp8GjrnKKqC\nsqo6FX3DrG7Y3NxkcXGB/QcWMIsls1lLiDWqPtdFQsJtbWKGFWXpsGIohxUSI82kJmikiuBMrzVA\nnsYVA0lym69xJbZ31pRQciipChvTXKz84Af/hPe+9328+OtfzI0v/1YOVWP2jy3nzq8yaSPWlngi\nxhoiTS4FEGhTyNOPy2UOXLGIT6tMpudYX3+ItUnLWm1Ym7ZEc5RhtUJVLmHtMkEr6mnNJ//8f7Pf\nGbyf4W2LtUt8/Stew/jKa5mFkunWKp/42J8xml26U3IXKfpZSvOBvz/G1tqEfUdWKK3jhS98Abd9\n+r4u3haWlpYy7aXLMUzRwZvAbDqjlZBDAt9SlQUaoQme0LRImdg/HjMoS+q6ng8v7WN56AcARcBS\nFEO2tiYMqiEx1rmW0w2BrYqSb7pxFfNe4JvgLz9bceKEfcQF3/e3b6N3csFjclEy3z9+wX073icR\n4d3vHvDKNzTwRnj5h87xP35lhdTBqH2bgohQliUhBI4fP05VVQwGIxaXxgyGxQWt2aoG5wpihPFo\njOiMQyjTGNnYmlEOKsblEFdUiBQ00wkhZUSNECnoqploHgAVfBa+s0rhCpAhTdsw811/fNtijLK5\nuUFVDfibv76FW//64/yTpzyNV9/0Kq5c2k+xcpj16YyzaxNCTBhbMGtnVOJwriRpog5KG0YMxwdY\nKJ5ESE9g4eCQfa2wOWlptSRGIVGBVFuj+tsAAB1aSURBVBw/u8Htf/MBhjrDhzVMNUJHR/nuN/wA\nszBiWic01vztLe/nQOV5+Pj5S16zu5ezGIumgo31ht/+X7/DP3/j9yN+nabeZDSq8LFma7pBU2d0\np6oyUtGEWd69gMIWeV5g3VCOBgRfk0IgpsRGaNBQs7xvgdn6BGsryqIgaew+PEOMff+C0LYeI4oz\nBSkBXdy/c5LWjd+yCj8AvAne895RzkO40FlSSl0fjZnXirpKI7CdvEO++5H6xDtNEYE///OKc2+B\ng1M4/HDgK18w5fZPHsDaDK/3NSgRAWtY3r8/306JppnOQ8r8Gl2GSJMgWGazBmdLtmYbXPXU6zg8\nKDh+4mFi
VApjGAxHjMcj2jBh/dQ5KixiBTFKaQxROwHytsHYHFKpMQyqXOOZtg2lLdEIYg0xzChd\nBg7uvuvT/MK9d3PVVSt8+2u+jf2LyzzlqqtZrxuOnz3HaDTM+Zgtc+drCmhlmMb8edh9R8EUDD0M\nFoXprCGqJanDJ8et7/k9ZHqMYVXiyxVe8LXfzHXXv5goI7AtVen52J/8MeH0P7A6PceguDR0fFlE\nShHZB7wNeAZ5W38DcC9f5OQvEdHXv+7bKIuKGFqu2L/IN9z4Uu6/6y4+c9d9nF2dklK+UCaTFu89\nZZnJk5iIc53c0WRKinmGYasBg2M2qTm3eg5bOo5ecZC6rhkN84UtbI/CtjaPzO4T+B6FQZWkStPW\nCDLfvcfjlve/+5NwFJqH4YavOsLmhsn1oosudCMGYzvaokh3cjFP6HdqB+wUwxORjHJ1pxOAplyT\n+o8/s8b3PTSBVfijb1rgZ//DtYSOVNr/7yTgimJe4HQKTZNlVoui3DEuO5/O0WemAiIMrMlzKlHa\npqEYFAwHA5xYrHXE5JG2YfPMOcZVgdABFiYhajARjLW0waNSkkSwIrQhEEVp2kDEMBxUpBQQNYgY\nYoDSOVJsWFgY8u3f+3quu+4pVMNFJrOW02ubBGAaIoZEaS2JhogQokNMLkzaoiRGT0SIyRETOBU+\n8fGPcu7cOi/8xpcjMkR0TEqGaOG2W/+Yuz75ZyxGmPjzqBjuve/OxyRSXq6z3Az8har+hog4YAz8\nJHBWVd8iIj8G7FfVH+8mf70T+CqyMPiHgKdq3wDROcv33PSKDtJMjMqqq7orCcEU24gOavHe431O\n5I1TfFd4LIpcrfcxIkVB9InPP3iMlcMHSCmwf3kxQ7cdJSTFlDlV/akRsxpMDBGx+U1PKYdo3uew\nZTDIFPdnPKvml3/wDngD3PW7jm/4hsOPmqDPHWYneLAzad9hfU7T0052njrzMA6HauKGZ9f88S+f\nga+G2d8LN73iecSUcOW2swRNXTIfuw7I7Xkt/SahqrlOo8qgrIgoaoRC8npPnjhBVQ2gEKqyZFR0\ntQeBQpVzp05SNzWLi4sMDEgMgMm1IvqaUb7dv/dqClQstc+MCmPyBIAUt8EVQfHRUw1GFK7kO7/r\nu3jm9TdQVYts1TUb05rNukWNxadIVMUgiC3wSbFFAVpmhVBXUAcl+YSzBTE1BCp8zKIZXhP3fupv\nuPVj72NBtvCTgC3yBvXZO2//4lnHIrIMvFhVX999iAFYF5GbgJd0v3Yz8BHyqLz55C/gARHpJ399\nfOf/7UMU7wPBjWh8Rys3CRM8mrSrhwjGCoNu0Ezj63mI03dLMmuZtZGHHzpGUVgWFsYkPNPZFN9m\naLQsS8rhIDeFxYSKYk0OXcQqSXPCXDezLLgtuRe9bTNJ8aqjNdwPfAU88IDtLsBA7ueQORW+f21F\nUcwdpb//4jBLOqdIO/KbudxqviM3J6GcPl3APqABY6EaFoAjxDRP7qP3+TV1/z92DOOdTuu9pygK\nnMknZuy6Tl13ch9aWWE2ndHEmqbW3FnqHE6VshzgEwTjOLMxodLEodECZiBzKFxToqCbnGxyv1Ci\nxdiKQeEYFJa69ZCUjWYKyeC6acvWOnybQ+N3vvMdCMJrbnotz//KF3DNoSVqtZze3GJjs82nkWpX\ntbcYcSQy3E8QrCmQ0ZDgPdgKrT1lOaL1DQ/d+yk+/qHfZeBaorZZpZ/B/DR/LLucnOU64IyI/Cbw\nbHLD7Y9w6clfOx3jUSd/hRDmdJbWh6yCbwRni6wk2bYdDywrUXrf7WBdV2VhLSKamcJ1zbETqxSu\n4onXHsH7Gh9qxqMhoatmz2YztiYTmEwyIlaWjKoB3nc9MmVJ0zQU3ammmhAx8zBm5fBWdpYnwYMP\num6Hdo+g48D22PLt0yEnxP1J0g9JfSybJ/2AkAulz7rBZ63m58L9949JQU
kEjCsQmIt2xJgnERdF\nQYB5ztWfKHNZKRFiG0Gg7Iipffg3GJbMNhuqgcthafAUA8uBA/t44MHPI1piNeCJPDTZZFzDwqgi\nBp95a6rYrj2gcHmAUN1OUOOQlJnQ0RpG4zH1tMV37RliLaFtaf2MwbDEOcMffegPec8fvIeXvvgl\nvPRl38Sh0QJH9h3m9Ll11gMkn2tC4HAyQJwQEhgpkOBJKbdzlKUyjVusnTrB5/7i/ex3NTHm2ZIm\nF8J4dOLRtl1OncUBzwN+RVWfB0zohq321o2UuFQ894jHnHPzOD2HDZJ5X0lJbcbqS2PzaIEUcM4A\nARcTJgRSO0NVefj4aU6cOM6oUq45uoKV3JBUDUYY4zBkSnthHeNqRGkKKltChOOnT7G2ucGknjGt\n645eY7uL3ZPUg0SQeMHJ8uCD5gJnUMi7XPfVN0a18zkmmcZjsKC2a87agZaZ/KVy4Rsl9PXOxDOv\nn84laO+9Z4EQtSswxiwO0W5z4nqH6EmVfcgqeaEYBCuGwmVnUB9JMWtrTScNaEFByWxj0s1fbPnO\n17yEN/3Q92XOWWwIGglJ0FgwC8KZ1SnTNuHF0lqIkjchHyJGhKooMSmLFYYUkKQUqhzaN2bfYsW+\nhZLKJcRFisqxubVFaDxhMqEqhFtu+Qg/9VM/wTtu/nXWzj7MvhE89ehhrjm8wrhapCjHYDJ44YyD\nmDBqqZziLGhaol1d546//APC1rlcvDWBylmGZYUxgm+bL+gIX8iOAcdU9W+7n38PeDNw8v9l8tft\nn7sbaywxJQ4uL3P44IGMUinY7kP33mPKrJo/nU7zUV04rBEms8TqyXOcPTvh0OGD7D8wzgTM1Gn7\npkTtPUal44k5VMwcSXPOcWD//vlFFWJgfX3WtSoPc9yOzC/8o0/w2VleB5//mLugTvJolJXeLn5c\n5pXKi+BimScr3YnSJewoSOKGZ8cMsbwa7r93gZR0zjjuw7Dc72PmrIQ+N1HtmAjd0/WOngCNmTzZ\nh5IikomXZcFkEjl3/jxf/VU38PCp87zjzT+JxszT05RybiKAOsQJtQ+EyYQBBaMFg3Uj1FQknxVU\nhkVJ6z2zusa4TPuZTTNdCYSqcAyqDBNbHWA00cxmSIhYV1EUBZ/5zKe549N3cuTIEX7wX/wQw6Ul\nrllZZGMaWN2yFIVl1ibUGMAgdkgbA9ONk3z8lj/l2P2fY2wz26Gyjslsi42tjb56dEm7nPksJ0Xk\nIRF5qqreQ57J8tnu6/V8kZO/nvSEq1ABK4aqGMzh3KZjAo9Go/kHJ12xq209i+MBoTVsbM5YXZuw\ncvggy/tGqDZo7HpRBDCWwhQ5edaYk3mX4/e+HlFgCKoYyegaQzentkynq4iY+YyXq442O3IWM4eV\nL77o+wv+wh+zYER/ABu7XYe52OHyd+3CAjMPjW64oc2qMz8Dn/uTwfz3+5pRURRI3HbgPm/qf3bO\n5VO7L7b6bd3jbZ3oHK5FTUQD433LTNbW+dhf3c4nby9InX4b9HmYyYiigKjNdJeQOzWfds0KR1cO\n8JG/u5PkqkwMjh5rEuNBSRPyZ+FEMr+s3yQ1c9KGwxEhRUJMtDExm04xhcNag4hy5swpfvZnf5ql\n5WVe/33/kv0HV9h3YIXN4DkXG7ZaIeoA1OBnU+7+9C0cu/tWBibgkqCiiCpLg4qFwQraUZROnj7F\nY9nl1ll+CPgtESnJl8wbyAHeFz35iyLHskkjNsb5BTwc5A7J6XTSCVq0tF2SXlUVPgU2t1rOnF3n\n0OErOHCgoqkD1gy6FtrIYFAQU6Sd1iyMxnjvcxEO33VVapfkCi7mMCGpElKcV/gXFhaJMWUuWbvB\nyrKHkxCPwkMPbUevFxcZH3mfohfBw9rBzX0+8cjCZZ/d5P9x6GDiSJngHNTXCMceGpNrMNt/F0K4\nYMxb/1gfivleG6BbQ9u2XVKd86ukfa
9PIqoSrEJUTDEghJaNmcekiNEI2gMa21mYKiiGRMIMCg4v\nCy977gqv/95/zc0fuIOP/vWthGRoplMEGFSOEMmDl1JG74wt0dYTJTtS4Ry2KCkUTBHZ3NxATKJw\nY6AgCZw+d55feOsvsLx/kX/+nd/N0a94KkuHlzm9VrMxbVnbmvHg3Xfyib/8EAtmhpFIIUOMc7Rt\nDQi+mRE151uXsssdwPopMhR8sX3Rk782zq9ijGW0MCZKJLQto8EQSYFyXGWRhXrKwFrq2TRDitWQ\nY8dPsLq2yhVXrDBeKEgh0MsshKh53LXvk1XL5tYUVaX1MzAtkMmTIoKKm6NDueejAGdpfItv8/iB\nKIl/cv0Q83ngajh1zhF8hZLlenpEvK9hqCqoxXQXImq4MEoTNGV0ykgPDMh25b/LWzRrrJIK5dk3\n1Bck99Hn/5+fS0kxOwVGMXbbAYLPa/FtwvvQIXT96WMwkvMr6fKIHnIGUC/dWpU2JJrkGTvbjRG0\n9H6iPdtZI84AanA2MSzHaBRCvc7Tn3qUf/b9P89td53kbb/0a7QbM1of0BgZWpglRW2B94rtWRaq\npJjQkIGLpcIx2rdEHROaLG3r8TEjlYOBsLF2nv/2P3+N4WDMa17zOq699kkU05bzp9b4+B+/i3Hc\nwghYYwmphrbXgfBU5QAfzQWijY9mu6iif4i2bZnNZsSiILQeURgPhyQRxBqKwZDCCsPFZc6vbbJ6\n5hzrG5scPHiQajDsCJQBYyx1E+YJei9+0V+AGWo2hK6GIl1NQVWoqmHu/qtGtE2T5ws6R9Pmbsty\nMOTIkXPzEOz48cEc6oXtnKRHogCs2EelvVx8guy8fw49m+2fVUFj4Otf7OEvgRfAPXeN56dCHwaa\nDqItygLfBoIkrHVAvACI2AYcMpG1CX6OkqUEIaT570uHDGVxQksdhGntGZYuo09KV0tR0LxdRQyW\nSGWy8o0bjDl2Zp1pG4izDb7q+dfz0j/8Pe6549P8zJt/msmZNZCCqrBEn3AiRLP9nvavr1cb9W1L\nWVWghsIOsti491jJc0NnTUsInre//TdRVcpyyPmZpzIRxWcqTMohfVmWFxSHvxBsDLvIOo4+MKwG\nrBxaYTQazWPrWdMwbVrOb2wyaz0+JNY2JkzrhlOnT3P10Ws4sP8KhIIYoNVEEge2IF4U7G1PvUpI\n31gW84DVGBTfRupZm4XffMAqxDagIVLZgoErMao84WiYO8vDD1edlhnAjkr7DsfcWfi7mPvVh1+P\n9nOWKNrOYURAVHnVq5oMq7wabv2rffO6SY90QQcVhziXQQohzp2qH9vRc+FiiLQx4FMiKrQxdoAB\nncPLPCfrf7+0FcZYmhhpUiQZCKpEQ2aI9whcEpJYioUV1lngoU2hKkfEGjQEkgpPvPZJ/Pu3vIWn\nPf95bCl455ChzcNWd2wozjqMGAaDwfbJHQKWROWyOw/KAieGYVWyMB4iBpp2llvKm5olpyy4fLqr\n6rw2tzNU7je5SwE1sKuCFXnHmEwn2MKxvJwnWakKDx87zuLictYSroTzq2t4H3jyk5+G9y2Nz/36\ns7oGa+aQrUaP7V5v/+YCJMlNYrYosR0kJKoUVZX7vUsHMSLiCFEpymIOwc5mM648YjKccQTOn6vy\nJGHJRMzsiD25c9s5uejEUHJ9SJD5Bd4XVy9AzHY4l4jwvOdFjqx5WIPNZ1j+7kfHGJNbn1NSMIJv\nslMYZ0kh0M9h6eHr/sTy0RI1oBGME0Lq5tqoEoNkxTARQtgesJpZBUpslUjO7QZVyXQ6YzAY4JPH\nOTNH1wQlqOW+tZb9jeOqJ34F99x9Pz//X3+dex48QSwXaDYmpA7kaFUxMVC43HJgQsQaujbp3NOU\nUofAxdyflC/8iGjsWqxzU10TPM44lhdGhC401eDZ8go2t10XRUWMuSDd04Kyrpx0n+Nj2645S1VV\nmQ
CIMJvOcC5L8RRFwROuuZq2CdS15+SZNWIMXLFymFnTIiZXt6dNjSt6UqAiJBIQFLTfGaW7CJVc\n9jZ5ZvyotPNcI6ZAqJtMsoxZGMNFx+J4geADVgyr5wwcBu6Dg89MjKqKOijbYxu2k1wA42y3NiXE\nuE227HYw2Ea9EpplYFNXNNxBfbHWctO3tXka13fAx27ZR4iKSOjyBANiSV2ik0UcEm1XBa+7sMq3\nHkSIus04MEkzXcbl57NJ8pCmTknemm2OWgwJcQarubfd+4RxJXUb8DHiyGwMQyKEyGC4j1vvfIAP\n/tXfYK2hKPfnOTtYKmugGGTWoGpmaAhEFcRarDOEtu6cI3V0fwOdnJPpNqiMxXVF6u4zGFqb1Vus\nwWuCQpGiwFqytFXY3kBtBgCJMNfC5tIHy+62FWtXwCuKYj7Fq/EBNY4oho3JDFcWrFxxBVJ0feeS\nCXjOleQNfLsWAszrDH0OkXdo5m+Sc9lRQohzEMAaISVIyVAWI1A7p/QXRcH5s4NcOToGh1Zalg8c\nIGRq8vz06EMj2IaDd0LL2Wnz46ZTsN/JaO5zqJ3UGJHEt75ymp3ldfBnH1zGJyGooVUl9DWilKHg\nWdMybQMzH5m2nqYNtD6ixpIwORTtcqGUsop88BB8roSrV3xUokqH4BVoEqxx8zU65+Yb24te9CKs\nGAajRYpyQDEaUY0X+PzZNU5vBUK5Hy0PkMgtzVGFumnxKlA4cI6AEJLJp5ZY6gjleAlTjmiNwSIU\nxmK044J1MH+PMPbvVexQTWPIoiWatZ1NShQOhqVjaTRmUBQ46UIvI3kz60PgL+Asu9hWnF9gWZaI\nNfP7DBYfE+fXNlhc3g8CtnA5yBEh+oQ1FjR1iWneCXcWCfsaQl+c64/Xpm1xRlDyRZbZzA4lZQkj\nW9DDtdYaptMpzjlOnrTwKuAYPOVpm7zph49z9vxG97/BFUpRKEWRr4GihEG1gbUpP+bIjxWKc/n3\nnFPKsr/N9t/v+B0LGYBfg43rDZ+4bTmPvFOIMRDaQJ+oqSptyv0l/ankdyJxCtJBvmhmHDhXdfCv\nsm9hH2sb61hbETRiElhhXnjcuSn0G9Htt9/eMTG6kwGLK7Y3JpJBcLQ6xRUFrW8xMfcxKW2HfDla\n73HOZnIlwsxHTEqkttNlbj1FWTKPseN2rti/dhFBU5tFNMSCyc/SO4Ald9kKubzQHxOFc/gQujrY\nP9Ie/L5QFkKgKEqM5AE6SeH4qVPUdcv+5YN4jbkG0nOX6D6clGe9J01dXSKRhXcNpigy5h9jzj4l\nEZOncBYrgpAp9M45zq6usbQwwjmlaaa5eckIdRsR44gYzp4dZh7C3bDyVs/3cmLHCwE80Hbf/WP8\nfBm/85EJvJQd9ylQAD8Jt9xygM0Nj4hDNXbJuOAbjy0sGhMR0/XG5PeHntHbAx2a0SsxueWZHkRI\n8PD5M9kRrMWosDWbMR6NcquEMXPSaS5aKhhLPZthncufheRWBFWHdJQaQfAaEakITYKipG5bjEs5\nJHIVrjQU5SAr7UjCuIpkFJ8aRjODWMf6pOZKV6DJo6V24EZB1mdMFK4ETXRd5R0JWrO2s8niHM4q\nkiIxBgqx+JTAGETzfEqsEOI/4gS/txTzzEER4cTxUxw4eIjFxUVmsxmFyRrGOexKoDKnrIBgbImI\nouoRk+saqjnxFTWIM5RlQdNmiLPr60LFEtRyfmOT8cIiMSSMcXgMkiB0GL5FOHW24sG64ok/11w4\nxbe3Aii778UX//NH/jO89N9t/5wEQhAefKDi13/kMELeBSEnrzEAtqBtI9LR3nvqpunyth5g6MOM\nC0LFuM0AsF21PxccYVbXjEbj+d9sI3Qyh86dy3Uqay6sT0hH2e+dlZRDz0nbMh4v5EShFKrBgDAQ\nkiiuKBgMC6646mqe/7zn8pXPvYH/9H3/Cm09q9MZR5dGJBVsBLFZ
Hw4pUAETIyqK2gpBcJmFR5Z8\nyuFWJlTmkxIUMZlRUEgupKbGU7ryktfprjlLn08oGSNHI2fOnKWqhrlHfpqLidInmOTxBr3o3Zx2\nnugapvIbFGNu5iJ1iVsI1G1NWQ5oGt8hUlkuCMlUcrF54Gt/32SyhbWGpBEJOUx50w89jZe+dJVq\nFBGxND6wtbWVq91b0Lbgg+A9oCVNo3ifL/a2Be/zYzEKTZsIweTHfd7RvDecPLHJze/bhyYhzBJX\nHjmCSXF+qqbkiWq794AuhOqmDyclae7zKJwjyXZo2jtNbzuha6R7/2JHag1Zl2vn3+WYfvtkkc4p\ne+KmdpoCvWO23almurDNdmFfuTAiVRWihmpQsrRvH9dcfw0vu/FlXHXVUSZ4ZlEJyTNYWs69LzHO\nUavSWEzKIym8JqwoYoWqKpnVM6JRTDRYU+LEEk2LJogph9kxbm8eOeQKiOkkkDSHfpeyXT1ZYoxY\nZwHDyZNnePKTn8za+mRePEsp4aTqwo7tZG4+ycs5vI+5KhvzZNt5sgY03ncbnFD7lhCBmKkuidyk\nhFiMG1C3uce8qRuG42WaZkpICWsz0rK2tsB73pPVXHJMbrnjzs8hO6q+/QVYFgMeAQmzgwdmHjn8\nVESoa4OdFcTgWRqPM5LV5V99v0hMSkydOkxeDYpiXYZatRs5aCRLsu5sQrugQp/3ky60zaPyeoqN\n7gAZ+vezL/b2f0vXReqco575rihq5iePAGJtDuESWa41eF77T7+bu+65k9d8z7fzhKuPMJnNOLR8\ngMFwiJ+sU4eAsY6zZ04xKkqElBN1Y0gmNyxgyBucxu4acewbDzjohFqFII5pSMRkCZIIRjNlJpnt\n/Ebj/DWpJtQqbawveb3u2nyWx/1J92zPLtP0/6WteM/2bM92sc6yZ3v2/5vtOcue7dll2uPuLCLy\nChG5S0TulawK8+V+vt8QkVMicseO+w6IyAdF5B4R+VPJUk/9Y2/u1naXiNz4JVzHE0TkwyLyWRH5\njIi8aRfXMhCRW0XkdhG5U0R+drfWsuP/WxG5TUT+cLfX8pjWIzSPxxe5KH0fcC25mnA78PQv83O+\nGHgucMeO+94C/Nvu9o8BP9fdvr5bU9Gt8T7AfInWcSXwnO72AnA38PTdWEv3/0fdd0cWGPm63VpL\n9xz/hqwi/Qe79Rl9oa/H+2R5AXCfqj6gWSrpd8jSSV82U9VbgNWL7r6JLN9E9/3V3e25jJOqPkD+\nIF7wJVrHSVW9vbu9BXyO3Hb9uK+lW8O0u1mSN7HV3VqLiFwNvJKsMtAjUbuylkvZ4+0sR4GHdvz8\nqDJJj4NdSsbp2I7f+7KsT0SuJZ92t+7WWkTEiMjt3XN+WFU/u1trAX4R+FG4QDViVz+jR7PH21n+\n0eHUms/2S63rS7pmEVkAfh/4YVW9gDzzeK5FVZOqPofMevt6EfmG3ViLiHwrcFpVb+MxeL+P92f0\nWPZ4O8vFMkkd8f1xt1MiciWAfBEyTl+siUhBdpR3qGqvhrMra+lNVdeB/wM8f5fW8jXATSLyD8Bv\nA98oIu/YpbVc2h6PxGhHEufIDbrXkmPlL3uC3z3vtTwywf+x7vaP88jksQSu69YqX6I1CPB24Bcv\nun831nII2NfdHgIfBV62G2u5aF0vAf5wt96XL7i+x+NJLnpDvoWMBN0HvPlxeL7fBo6TCfEPkWWc\nDpAFy+8B/rS/cLrf/4lubXcB3/wlXMfXkWPy24Hbuq9X7NJankXWi7kd+DTwo939j/taLlrXS9hG\nw3Z1LY/2tUd32bM9u0zbq+Dv2Z5dpu05y57t2WXanrPs2Z5dpu05y57t2WXanrPs2Z5dpu05y57t\n2WXanrPs2Z5dpu05y57t2WXa/wVjO2xV5Ti03wAAAABJRU5ErkJggg==\n", + "text/plain": [ + "" + ] + }, + 
"metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "# randomly sample one ref\n", + "ref_ids = refer.getRefIds()\n", + "ref_id = ref_ids[np.random.randint(0, len(ref_ids))]\n", + "ref = refer.Refs[ref_id]\n", + "print('ref_id [%s] (ann_id [%s])' % (ref_id, refer.refToAnn[ref_id]['id']))\n", + "# show the segmentation of the referred object\n", + "plt.figure()\n", + "refer.showRef(ref, seg_box='seg')\n", + "plt.show()" + ] + }, + { + "cell_type": "code", + "execution_count": 25, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "1. woman in front\n", + "2. lady smiling\n", + "3. woman\n" + ] + }, + { + "data": { + "image/png": "...(base64-encoded PNG image data elided; truncated in source)...
yBw1tp45xcZqj6jQxNIimSbYjRYDKZmQ\n4C8EHJxrVw4DWAgUSiksxyKJExwlcVwPaaDVbHPt26/ij37/EbZ2nebSQ03uf26FO/8i4tf+/Ztp\nr53gxItdFmeP4Ng9dLbJb334CR5/PMPIBlGikFJTmMpGwrUlSVkQT3OO7tnLl+64ixve+AZC14NU\n8/SX7uPiq67G3W0xsiWtZpNCGvzCoXtm7YKdIfUUkU3IiybG8pj3OxxsLEBUIIwmKjPcQtIqxxy+\n9mrWukNOfOFRptOCdsPmgXvuYbI25Q9/7aP82M9/AO0JRN3wGx/593zkt/4zX7yzT75tozKBLSwK\n2wEjkOLCFo9Xjte0gg+VpqtIItI0IRQ1hqMBM41ZslwCknPHj1NbalMIzcyuo0TxCpO1dZQTIoRD\naYWobEJaWozi7IIZHEvgeA0mkzEiHqOERvgKt+Zz3RsuwUoGFMbH0SUiz0l7XRwkKstZH20wn4xr\n7QAAIABJREFUwsMNm2yuncGRHou7d1NkKa2ZBnkypjg7xgoLNs+dY7C9Sc0GVEng1imLgrAekJQp\no8GASaJJsQi8GmWRU+SVbUaUZqSlxzCNsR0FCLSSRFHlrjtYTfn0A/fxxssvJgwEu64+ytyeAzz2\nzPPc+OabeGTtPLUo4/E7Ps/N9QU8z628WhwLx7XwPK+is/gOUiriOKLenmF7PeOOvzqBY+/lviee\n4mNfGaBrh/ilX3g7wzjnUx99hsuuvh2jx7zuokUef1By/Ml7+bVf/35+8Rf/EC00xlhsdRNG20Ok\nitm/7wDnls+ijeHSSy+iSDK+75d+ki99/kvkz69x/OyIq95qkfgu6uo2whRMeiN6mwOY+fq6TScD\nnHqTTFmc317GDUtcMyXPBa12m83e/8vce0ZJlpR3+k/E9emzvK+u9t3TPd57YGZgsIOVQRJiFyG0\nkkC72pU5QtJZkPT/CwkJ0AoJs7III4RnYBjHeG/b++7yJjMr/fX3xn7Imp5pVnCkZc+ZjU9VmXUq\nTmbEe+ON1/yeKk4WkBlqiysMT42y/8BBNl9yMYtHnuOTH/1rxie3UT/Z5Y6/vpfF9hxyUHDVdVfz\n337zPbzvPW+h2a7zjtf8JJo0SC2HJOq1DPyo8bIZiyY0EGqjLwXQNFzXJ18o4YUeMZKcLEDUZWBw\ninpjjahRI2vbIC0UOoZhIdIQYeroqY7otM6ZQ0mNwIvA83EyJkEgoJOAndJpR5QGxtCkQxR00aVP\nhE82n+O5R55l257dDJZGeeLAcYTSGBgfwotiYgRz80scPnGCnRMzCM3AVBqOk0FXvWJBJ5uhE3Zx\nvQgvCtENia00CFPiJKHTbQECK1+kUmmgzJTVegchTDK2jvIEmbzOtNMHCx1ONVp868gh3v/rv8KA\nYyIMh3wu5ZTuURUpD913P9s2bd84RSw0o6ezlc/nCCMfzdbJOA4xPRBpFHXI5iVVV8fp6+AYWX7j\nnW9ET9bYs7fDqeXrCNQ/Ue/MYRtlqlXJpdddwbvDgJNHDXbv3c6zz82hVIpuFJnasYfm6izdZoUU\nyOiSQl8ew87zuT/7ND/75x/CFDk+8qu/wVXlEpXaKlEY4KoQowuY527SNuAjeGL+KPNejasuuYDO\nqWUcLYNj5wmDFNdWxHpKp9Vkdl+FnVdczKnTczRbbZSK+au//XWiNKZYKqCE4tiJeSorVfrzFZIg\nYNOWLSSa6KluygxhHPcgrD9ivGzGkqQJhiaRms788jz5XI5Oy2VheQlNpaSagbBshnZfy3wClW6A\n6YacmZ+llM2xXKvgdlokBCSxIk4k45OT58zxxx/+CGMjg0iV0G5VSdKQvr5h+odGadYbmKZNcWCE\nRqPKwWefYWpsCiP0mJrZzuH9xxia8Xnbm1+D7ntUax32HzvNviee5C1veR0zu3Zh5IsU+kpcn7dQ\nmsFapcZIfwHPb6F0
g3a7hQZ47TYAjuGwUmtRKJY5NXeSJFbMz87juh5uENPwQ2xDYhkWCoFlSVYL\n8JPXXs3+ux7n+T/8PO1xnfzQKIvVddx6F+Fq7CTLpsExwqQCUiB0SeL7uG6boZFhgjTGNExEKvBF\njApCdDNPJ9Bod7PAGJr1OjLZ/azN+7TaX2F68jxmxid5dt/z+JHJ/OnvcO/dj7N4SifVIuJI66Eo\nSPj63ffxivO34hgS33cpDpQxdcnVN13H1IXX8unf/yMuuOxKrr34Ak7NHuX+b93Be6+6jCI23/7i\nP2A751b7xlJnpVbFMwXDfZOsLtcxg4SBySKVMMDI5ThVX8MREpEqrKzD8/c+zC//7m9y1x3f5+hz\nB/jQL/8Gr3j1zVx6zWV86QtfQ3mCTaND/MEf/gXvfNfP8MiD9/Povsd59TXXERsFulEb0h9tLC9b\nuUsraJPRdFq1KjZdkjAg6Lbx/JAo1jh0dJbrr7wcjF6nW+L7SBJSx+HUycP05/LQ9eloEr/VJU16\n4gkXPPqWs/O033OAk4cPMrN5BqRCTxIiFGamF0IN/YAkAStf4t7v3sHOkX4GLQsBxGFCN2yTMQ2I\nYk4ur6OPjbNaXWPv9kkyFkRuD/IT+AHdbpexkQGEChGJS+B2SLwuie+Rhm0kkrzlML+wCmicOTZP\no91lfnGVuqtwXY+rN5W5aNMEpm4TpApdM1hqh1ixxlhumPWFGq2nj1HperQUeAYYEyOYE5Os6T62\nHtGXs4hVhCt8UBGFUpFMIU8UxeimQ8dvYts5MkWDoZ1lPvO5e7n08pt4w8/8ImVzEMu0eN8vXc5F\n2y7kW99/gAsvuYL9h06wcnIBP5AErkmc+ijVC/8rJCR1fvHNr6YsA6x8P/XKGtMTk7zy3T/HX3/w\nz/jA7/0+//Trv8Po9AQLScCpp4/wqrf+NLXVFkGtTWa4n5/K/bez6/YTX7iUtK/M0fkzWPk8U5PT\n7NmylccfeBDP64JhEyooZzJYSHQFIkoZ3TLFT//G+3n/u9/HpUOTGIZNIARTW/by+GMPMDCS5R3v\n+mkm9mxneus4RrZIt7nOra/7T6yvR2jpGQ4ff+7/vNxFCDFJj/o1RK9L59NKqU/8uKg82zAR9ORq\nZBoRuB0cS8MwNYSWQ5+X4NVJugGGVcRdX0fgYuSH6Ha7DOWzWDJBdVtkdYkwNLL5c9U5rG6N6cEi\nQvU4IEbWhjBEhhEZ08C2EogTnn/2MYp9/eQHBnHbNQqmQ6vuovwQX7nMnV5kYmqUzGCJRLXQVYLv\nCXTNpra4ytrqKsMjfaRhSCpCMlKiNA10k3W/heXkif0YP7WIUoNmrUbkx9SrDQIvRAgLaRhEqhfB\nG+63iCMXQ5n0WRqJAR2tiz2dwx3Yg93xIYGcFLhAIw2RoY9hm6iN4s1Abij1b7Abc7kcbbeDbVto\nmk3qwuyh04wPlJicKDJ3dJHbn/rvvPu9O7ngQotXvuI9fOO7T/Klz92B0Bw0HKIgRUkXpcTZfi1B\nRGLmOLm4xnU7JzBIyOk6/soqX/rInzHgx/ztL72PWnOduXYV37G56ZXX8vx930FIA0ca1MJl2PXi\nuknLptNo0+84ZAfK6LbOV++6k8u37SYbenhel1jPsbi0gDBNPKEwLJ2FhVn+9Lc+RE5qrIUtiqZD\nFIQ8/NhdmCLhoquu4fp3vKYXUhYpoYzJZWx+5ufeyue++DhBmIfjz/1QW/i3hI4j4D8rpc4DrgR+\nWQixixdReduBezZ+5wdQea8BPil6zQfnDJWExIDp5NF0EydXQDcdzEwRra+PyYlJjKxB7Me0fJfy\n1BTCHCBMIqLKOnG7i9I1QlIy2R5fo/uDioK5HAdOztJ2ffxuSKXdpu6GdJOUmhdxZrHJnQ88zeim\nGUYG+jk2X+HpM2vcd3KWg7U1npxd5uFj89TjkKdnZ6nUKjSbHZ48epwDp05x+NAxnj
10hNNra6y2\nukSJIvFjBCClSacTkcmUcEOwnDwnT50gcNtUl9eIkxBDSob7SgwM9rNeb2KnBoHvUVtvomOQpgLd\nBNMQRH5I5IeEXhc/COj4ITU/ptr2CboxQpoYhkMiUmI6pEEXO+MQxL3OznrbxTItVAqWLZDY1PyE\nkyfnyaTLrK6e4Wd+4ePoxgf4u88ewrZytLshOiYE9KRSCSGRJKkiUhAqiJQkCOCeJ55joelx4uRx\nUi0mtXWGYh2jXsO0MiyjsW3HBfj1Nvc98DjP7D9KvbHGeqtBvfUD+TFLoxO3aQRdjp86zvLiCu/6\n+ffRDUOabRc90TBUyNj0JLHSyRhFwm5MaWorp8+cwdAEKQmza8ssVmYx4hZbto/z9nf/JJqhekqm\nwsQkAV3j53/uNcSdA8Ttkz/SEP4twuArwMrGzx0hxGF6OIkfC5UXRz1SrReEnDx0knplBaIAwzYw\nnSw7d++h3WpzbH6ZdtjB9xMEGhMzO4jLExxa6/YUEA0T0emQxAn6DxyeXT3H3qtvxPd9YhIKtoZm\nWkR+TGNpBdfzefXrXk8axwyMZNh+sUVYb9LtNFlammPbOy7ke9/6Flfu3UlBs/CSmLHJCYyMQ1Bv\n01pY5vztu0DXOHxgP2mYEHsu7RjarSZGItBFj5TVbrdJ44Rus9eM5QYBmVwWqVvMnamgSRNhmQiz\nxxRpJjGFfB6ZSAzTxDRs2i0PkejoQpImCWmUIGKJMCSWZaIZGrX6KtmciWkaPRX9UnmDMKYRBgFO\nLoOpSyLZIXQdrr9qinJfg8svey1JKkhTnS3Te3jljddi6g5prPXQD6RnVViUEIgeUw+kxCFFCZuv\n3P80m/vKXGkXiRyH5bqLM7gJLWpyZf8U/rF5BvNlvG6byYECjgzBKbPSPNdYTi7O4ScxOcdh1+AQ\noZ3lu9/4Km4acvUVl/H9r3+TicEB+h2L0f4isZXn0PISteeOk82WqTVqhEGAYWUYmBrCzJp88DMf\nwynliEkxZQ8FjoIkURimQckQNNxzmwf/3cby0rHBlryIHub0x0Ll6YaDVArbNtlz/l689REC36VU\nyhOqlKRTx7Eddl91EaaQhK0WKo4gjpClHO0uqMDHKRbxui7FQj8nT505Z47ueo3y8Chp6GMaWRxH\n4a42OLTvMFvP28VAX4H5o0coOFlIJI5pcmzf8/jtNjM7t6P8kLwQmGFAPfDIOA5HjxwnXyhSW12n\nP1dg6fAJUqnRaQcszS9jGyGarlBxQrnYR7fTIQ1jhEqQaa+/RKmEOJUo3WHf04ewjD5EN0UKDdM0\ne7zFKGatWsUuZMlmoJA30fSYYt4iDRStsIWtgWXqiHwGLS9ZWj6F1AOEmScJFHbGotvtohsWmXyO\nvJMjkZJsIc/6siCfJmyauJiJmctpd0LKgw5SSI4cXsEy8sRx2nO5hNYTxJMv6nK9QNhKlEIqSSgU\nDReO+h5vuWIvh44cxCoUOTp3BiPqcNPUJpqVFnqmj35lIpMmpi6ZiyP2nTx+zrpZls1IsUjWsHCU\nJAhjkiDhogsvQUQWo0OTDOYytDrrrLXO4KcmN7/yNaSexHVrLK0eoVJbYmggy7v+y/t41WtfTUSP\nIqZvlNv0PkQvSEEKlbVlhPzRumH/5gy+ECIHfAX4gFKq/dL3Nhgs/z5UnuqBMTUJYeoRxC6q3WR9\ndhbL77A+dwJHS8Cw0cKYomUj0ohSIYdAMTo8hIagW2lQsBxymQxheG40Y2R8DH99laxMMKIOK8uL\nnDh+gitvfgVnlhdZmD2FnkI5l6VdWeaxJx8iO1zEy/RE9NLEI3UE+f4iw4NjrM6vYGgmepiyadMW\nFpfWqHVbHDxxmlYY4aUp45ummZiYoH9gCNO0e2ILKkFDRxdaL78SWzRaCc/tO03YBsvXKSUWUb3T\nw11rBqZtYdoW7bbLysoyS8vzIEKiuIOKWxh6hG
lECNUB0aXeXEKzQTNkjzti9JALmtTI53JoSKTU\nECgGiwNMD+/gor0NDDGEF2yhUCyiSw1ESuinpLFBmmiA9r8tn0h7hqJET0shSRJIIE4idBSRGzBW\n7KdfN7jwvF1UPMHDzTr3rS1zJl/ieSWoDA9wd6XKwdUVBkbOrcka6SvTZ+UwE0nbsFlzQ2589euR\nRo5v3Pd9Jq66BDkzzIFKAz2fZ3JgkPl9+xFRxMrSGSwNtu6c4VsP38UNt9yI73v0lOoUQkniVJCg\nECSkCr7ylW8iNPssFfqHjX/TySKEMOgZyj8qpV6gfP1YqLw/+IMPb/wkufyCHdx42W6M4T7a7Q4r\nzQa5TdOsNKrY+TK6YdJpt7CKfaw3mlgZh47rYQiN8kCRTttlzXM5cPzcJ9QTDz5Ao9kljhR+pYrf\nWac8OsL6ffcxf2aWLdu2Umkvcv8jDzE6PMSmndsY7BvsNWE5FlGrzVU33Yqna6ggZFCbYDxXZPnA\ncYxSnkoc4zZcpITxmUnO27GJ+ZMH2TxeQqSSUKUYhiCODWprFaIYQi+BVKOV6FQqLjNYUGkQGwlr\nR+eYnBkgThS6pmNksxQtmyDq3VMaLR8vBTcS+EKSmFnsgUEilZAEAVY2i+e6pImBrgk0aWPrDijZ\nK6jEwPd81hcXCTpLhKsapU1NMqbAtARpGiLQMWyJ68UooZGotIeSQGy4YT30NwrUxiNaSYH2Aqkr\na6DikOFNk3z7njuZ2b2DkXKJuJtSKhTx6lVaYcSjC2fwgy4Cg02DxXPWLQoC3FRnZHorh06c4id+\n6qc4cvAQTz37PG99wxtYWlnl2PEVbn3t65h9/hB5K09b+JyaP0B/wWFgz3Y+/LE/ItESdNWDQCUK\nEiFRQkNHEcYpUsIv/Idf5P57DxAECwh+zHIXIYQA/idwSCn1sZe89WOh8n739z7Yk7FJYX31GJHy\nqdbXcbJlTCdP2GyTKh1NE/gWWImNLBTI2VlM2+Lphx5k77ZdVOs1+rZuQ6UJbx2Y4Bf+5c/OznHJ\nzDS+Sjl0+DC7LrwBSOiGIV6UMDUwiK1ZLNQbvOKiy9DyJkurSwyXC5TKDiJMaK6tsDx/hkypwHOP\nPcP2zVOMj0/TWa1z8OlDxJ2EnGaAhGSlyoNHjjEyWGA1ibAdg1gqlCXodEPCSFFdapL6GvX5KtWV\nLoQQJl1GcZhHMOea7CGPKROEiNCQpEpg2DlSI0HTNfSo18Js6yYiYyNMA89PiDWLrOWgmz3dMqHJ\nHq8kTZFaT83eyhTID49hqIiVbpvhqVFayiDWQPYC83TazZ7CZNpD/ynVc1/gRddLyBf/d49BCQhI\nFSyvVRgeGqGyXmHz1DT3PfIQb3vlLXz37jvZfN5OTlbqlIcGWTx6hDfccB3zyys9yvNLRn58jPzg\nME8+f4g3vuEtPP7YYyzOL/DTP/k2nnzmGdbWarz6lpt56rnnWOm67B0cQsUd9FSxd/s2Lrv0AmYf\nf5Zt115Mokl00ZNo1UWCFnvIVGIKk/f97LvZ9/wC5ZHzSVsheuKxuv7D+/D/LSfLNcDPAPuEEC8A\n936bHxOVJ0XPTZAIIiUo9Q/TDRIy2RzCC1C2hZaoXmhyvU3b9ciRkCodXySkiU/sd8lZWdKOR4oi\nCs5FTnRqDY7MnWbvhRegghglUrQkJqNSFuaXKOWLbJuZxkgVXhAiRY+/bmg6QRwRuB7HDh7Bth0u\n3bOXMAl49rl9eJ2QOO4JyKWeTy6bpeMHGPk8y80uHT/ivPOm6VaX6LY92m0XrS1wYofKYgVd66Ng\nmlTSEF+TmJ5BmI3QMkWe2bfAhReNYyqFrnQ0Erw44cTcKok0GevPo9sWUhg9ZX9doqSJMDK4QU8w\nHCGJUokgxbYtdKmx++KLSInQ+oZwa6vsHM4jDYOynkEmOik9bHa+kCFJEuJUbRC9Xvw+XwATJRui\n32dFuFOFEi
+AV3UqeNSrNYqGw9UXXcr8mVmyA2WePbifPRddRd/kBPuffYK1tSrFfIFa99yq47qX\nUJlb5M1vfyvPP3OY6lqFN73pTRzYt5/a2io333Qz9z/4EErBza+9ldljx6h3Am7ZtZ0xIXAf20ft\n+WNsmtkME2V0CSI2iLo+tYUzpGmR3/rwh9j3zH5iPU9qNknjCEv/MeVblVIP8cPvNv/HqDxBTxxN\naYr+vnFSoRiZ2kOztoApfJychVdvIVo9v9jUDDqNOjGKktGPZRoYBjhhRBiGCCmw7HNhRsuVNc47\nbw+dbkSn3SLv2Hhul33PH2B6eitRGrOyutoTeNB11qpNGuuHSFNYW6vQbrfpK+Q4b/suVtpVjh49\nSqk4yNJ6g1a9Qao0TE1S9WOCoEOcJKBpTA9PU39kP6OGTjYIOXRoH53VDpvHttBccak15gm0TK/0\nRZlEKPJZk4JyWFytsy0axDJBpD1obCpTxiamiIVEBT6kGolmgpS0Qp9my2NobJB20ETaFs1GjZJp\n4Me98px6pUZp//NMFQvUT89jWDoWMdV6lbg0ysj0lRtrAiqN0DSJpiniNEHQSzz2qsR71eIIQUqP\n9KVLiRIKJRQo0ITO0eVZJjMOsRsyYGbwCgIaFYQQ3HjDDax5Xa67+WZUZZWOHzI4M81LY0JOLscV\nl1/K448+gcLguuuv4zvfuR0p4c1vvo1vfPObWFaGXVs389A991Jzu0wPjjE2Ncry4hz7jx/kwrFJ\nxP/8PJf9/geIlEIGHk/f9yi33/49HnjmJJ1gCDN/Cboy8RXYfZcglQf88PDxy1fuouIeo1CBZmQR\nG7jlMyeOIbrrZzPsnUYTBExunmFteQUtUWQLeQb6+ml0XDBBcxxqqxXc7rk+56bLLiPutslnNYpD\n/VQWlsnlc9z8xtfRjgRZKfA6LbLlPLFUDHghj97/GO16ryJgfNMEadDCtEMaJ2qUs2WaLZd6p4HQ\ndXKWTqGYZ6G6SE5ohAqc1CaqLGI3dAqqwJFnnkSYCYOFcY4eXSVOdCZkBmHqVFyNpmbjpV2sjuRg\nWmUosnj4iRNcdcluCiRIAUmsoWmC2PcI0xiBINV6escJCfvufxykpDg1wPDENHa5RCACMghaVR+J\nwlAabT8g1zdELe1QLvURrjdJpUmi9y4hCsXsmfke+k4YpELxQuxGSSBVpPQMQwFCCoIk7hHZ1Isn\nzdzcLFMze6l2O+SjDc3jMGG0b4Bvfv2rGANFMlmHN77rXZw6eYbyQD+fXfjS2XW74orLufPOO+kf\nGOKWW27lC1/4JxApv/De9/KJT3yM4eERdu/ezZ3fv5u+7CCXXv8q4rk59FbAsbUqTr7M6bDDptUq\nutIghk/+xR/ynW/sZ3Z1CSkUQbBIbMbY1gyR8lFWBo3/R9uK333ba8hkbfpLBXbtmOHyyy6lr6+f\nLUNlEldH6gaWYdKMAlI/oC9fwokVZgqa1OhW6iBbGLZFPVhgfOsMDe1c3zetVvE7bdYqFdqdNjtn\nNtFYXES1qswtrTFQ7keLFetzEbqV56lnDtL2AuxcActOOW9qjMZiwtL+Qxi+R6E4RLXSxEw0DBHj\nuR5J1MFtthgvjREGHhNWP1qtQWnRpdWZxdNyrFYq1JfnUANZ2q0GTijwwhBjYAfXv/2dnPrMRyn4\nLi0Zsc1w6bb7eexMjUvHs2SdLIqUOIpQukUSS4QySYSBxO4pxYeKkq7ROlVBtWCp3eDqV11PR/pY\nmsm26WnsqWEWqlXKiaAaxEz29dG/UzI4tRsDrWeAqudK6bpOEis0BElPgRy1YQxi4yIvhSBNUgzd\nOMtikUCUpjx34ghhtY0qFWiuVdB1kEJwwe7zuff+h/mL3/wMDz75KHd/726kbrD/wAHY+uK6ffnL\n/8L09AyvfvWt/PM/f5k0hdtuu42Pf/wT5HI5zj//fO644w5GRyfYOrGZB+65
gxunJ6m7HrWuYnzP\npahShm71BCptEnd13v1rv8m9D38AqzKHLtpIGWwIww9CnKCpWf43Of8fGC+bsbzh/POwHYfpLRNs\n2jmJbmWJSUi8JlIpdCVQcYoRK3zPIzVNtDAATSPwfGxbw++45CydTJJy5pnnGBo9V7DC67SYP32K\noaFhCo5Nt9uhkC/ieh6dVoeRyWkymsXK8iJHDx0gkykiwwY7t20jjWp4ccDCeo3ztu1As3Pcfd8D\nuDHkSmW0NCH0WujdkIutcaKjFY4fOQbjU2zuH+HB5WOsNToY41PM5TJU/BCZQktTGKaGEbZRhiQJ\nBF1bUOwmTOtF5uIuO4g4td7icCbPJC6WZqClvUt6DD2RbWmQoBGYJo00pahZyBBEu8WoZfPovY8w\nMDXMFa+4nkaYkB8eora4Qqi18bsV5g6FNFou623Jzqu30uuMFjQaDVSqEBtCJ0LRm1HXiaLeyf3C\nCQKcJRQr1fMSpCbopBEXX3QxCo3czG6qrSpxFKK5ETMjQ5w4coirr7yaJ3mS6nqDXbv28lf1T59d\nt1JxgDe98S188YtfIEkSXvWqV/HVr34V27Z5wxvewBe/+EWmpzcxMjLJ3XffT3a0jFUcZSlTYuyi\na0iyOXI7dlDeeym6brFyapaBi6f5nd97Px/4j/+Jrt9Eymwvj6RlCFVALgUhfrS6y8tmLK7qcMMr\nbiCj6WSUTr3pk6qY4vAQXuCBLtBiRTlf4qkjxzCkRdeLCFRAGsekfoAXBSy6XebmZynmijgj51Yd\nrwYJo3suxjAlKklRcYwbuwxu2Uph+y6U0njqocc4PT+HkBYLp05xwZ4d+FGLq666gq7bZmr3btAt\n/ubjn0W3TLaft5UTR46RDWM8v80uZ4KlB/aRrFaYsC2qyzVONFt0gohte3ZTc310vcBY4hMbJt5w\nP2mcoq0pmisnefAfPsJ46hGkCYktWIwE2W6b4Y7Oieo6IskyXIKMrhCpIJE6qaGRCIWB4M7HnqBg\n9ORtZ42YrKuzGAe4mqR5eokoehjbgltvuJkt5+9GVFfJGH3ots5616NUGjwb8UpVyv4DzyGERip6\njhlaTyY1SZKzXMYXDEVskInPptlkL5wcJgnDE2PMP3mIRjlHUbPxlGS5WmFkZIwTJ08xV6uxa8du\nBoZdWq1zWyve+rY38+V/+RKmpXPlNVdy+/e+i5XNccWVl/H3X/hHpiZnGJmY5MF7H+SCyy9kZMs2\nnnryEB09oq88ybBjsF5b43CpxsXqKnJ9RUQKm2cmCYMOlrLxdYUgILJNRLSMJCBNzqUw/OB42Yzl\n6tveSloo0ZUxzTTAkylJGCLioEf29V3CZpviZJ6tl1+EUDCzZYooSgiiiKDdwk4lYRCx98KLQUoM\n24YHX5xjfGqK4kCRp+5/kK1j4wRJSD6boVtv0mw0ObT/OIEfUDQzJEJjxWsTug2cPpNH7rmHTCZH\nrdll//4j2Nks+WKBw0eO0O52cVId69AiK26VjJahvHUrx9dXmNq9i5MPPUjGj6ntPwqmIMxKMsLA\n8CPK9QZ5I8UPY/IqQaUxkVBo6PiuRzFfQCNk3CywEAQstnViA/o1Sd6wUVYva55KgTIMDp44znmy\nnwEnz8FOh7oOkdR7PT5Zi+Pzs2zt7+fX/uOv8Wef+jCNQpPc+AxF02Riu4VyJhCGAULgXQRUAAAg\nAElEQVQhleDE8VMEYUgqjV7kayNj/9LTBF4SRj77ugJErywG8JOI4WyRuW6bJE2Rpka5VGbf8SNc\n8Irr6Rsf5dSJU5w6fRohxTmZuW984xvk83luvPFGvvq1rzExMcmOnbv4xjdvZ9OmXWzZspW77rqH\nq258BUoTfO3r3+HCi6/nuquuYSBv871v/wsPfvsBnnziXn765tv43Pe+RpAm3P6tu0CmpJFLTISU\noKUaegJoCsT/o/0s/Zl+VCLIWRm6fhvH
0XEGSiStDpptE1kpWUcRLS2hBKg4IbAsoo6Hnskxd3qO\nbLlM3+AgURACitnTcz8wS0xtaYm8JhHdDtm8ReS6LC+tcvLELGEIaap6eQiVksYhWzZtIgo7DA2N\nsrpS5cTRM6SJIJMr0nEDpG6BDPC8iL3mMKNjExyenePpM7OYU2Pc9+yz5KOIvK7RXl9jsK+fLRXF\nnGyjDfcRazFnoogop2EJHdEK0USv/spAMLZ5C6snD6N31kgaIZ3RQezAJiIltiJM3cTSTGJl8vTJ\nUwg0Ei3CcGPKymBVBAw5/az7LplSgUhJup0OohPyp+/9IO/+8Hvptmp0/TbCsilO5sgUxkiVQhOC\nzVs29+q+4JxT5IXxQvj4hfdfMJaNQwZJL3y8uLbEAJBxHLp+F6/TJpfJMjYxzqc+9Sl+4f2/guM4\n3HDD9URxzMcXPn52jvPPP5+JiQm+853vcMHe85menuEfP/95zr/gfMZGxrnzrnu5/vobqfstnnrs\nEG972zvpHxrgmUfv5rFHv0feCCiaBn/3t19m84UX0ai3kLbNx/78sySYJBL8SGJt5IZUEpEInX+l\n3vec8bIZi2nTo9ligg+ZUoaW2yQrBYmWIhMNO5fBX21hWTqRSEhjRbu6wlK9xd4rr0PP5vFbLXTL\nYG5uluGBoXPm8GsBucESrUhRNiyatSbHj5ykUfdQiYafeiRpQuAH6IZNrjzAvQ88jGXptLsdhLTx\nQoVu2iwur6GAgBiZQCMIaJuC4nCJsq4olRz6d+7kxHdX6LdLNNwWxvgwZ1pNbByyGPg1D83OccJt\nM6PnGBnqpxYvkgY9RomhS5YqNdZa62wmy57EYFkXeEGEZuo0Ep9MbCC1HpPmwecPYWPT11/G8GzG\nug4Hkxae7yMsk2i9hSU0QgFOqcApz+P2L32bbVecRzkLW7aO9Orw5EbEC0VtrY4mNcK0FyLmJYYB\nLxa+KICNkpcehqz3t6lSiDjhoeefZVd2mCNL8xSKOWLPI0kUQpe43Q6f+dSn+aVfeh9Hjx7D9Vx4\niYrVmdOnOXHiOG94/esIuj7f+PrXufWW1+D0Ffja17/GtddeiVIxB585ym2vv400aPJ3n/4M0mvS\nlzfouoJuO+VPPvrH3HrzTXz2U//Ipa+6gZk9l3D86PNEfoY4ikmSDkJIIi0iTiRCnduW/oPjZTOW\nzso6MmNxqrKOYdu0ltcI05BWHBDGIVI36NSqqDSl3W5hOg7V1TVsq8Tk5vOIzQJekFKttsiV82y+\n+go03YDvvTiHuamPXLHMFYO30G13Ofj9u7GLg1hpl067i4jBEDHlgRLn7djKyPAAs8urNJpNlhaW\nCaKQLBqV6jpWXicrM/hJzFgs6X92icGbXsl8u4UIQkqjI3gyYaJ/gFHPozBYpBb6YDosdlvMdjxm\nxjdj1tYZ0E3qzQ5Dhs2IIVkMDeoiwuoG+IsLzGSH0dseoypGWws4XhIkfUWyiU6o+TQSxVojpmQ5\nBH6HOBJ0tYRRHB6yFGOyhxW1NJtMIcdSfY1WVjE8Msahp9fITEyx883Xofc5DBRmaKsMSgk0afDI\no48jDAMVRGfvKcCGc9Vzx1LFOZGjF5j3aboBiJWC1bVl3vnaG5ECzGyWvNTwfJ9W5FLMZTlVW+X+\n++9natMkJOdu0r3nX8SmLVtYW1vl9tu/w+XXXENpaJDPf/7zXHrZVWRzg3z3zge49bW38vA932Jt\n9gjn75zBTfOcnF8mUL2wtqlgpVpjbOsWHn7oPv7gD9/HJ/7yb7j3+yfQpIGuN0lDC61wOYkRkU1D\nYP6H7tmXzVjyWydxhvoYVgK8CBwb5XcJwt5FX+oGke+jN0PWlpaorFe4/OorKGXzSCVxwzpz83NM\nTE9TtG3ckyfodIJz5ijEGp2FVfY/vZ92vYkbhpi6gaYEUimSyGVidJhivkAQuRw5fZy5+RWipKdA\nn6gU
P4golUoYmoFSKX1OhlOPPM3oyCDPHtpHenoJGYWo8VH2PbOMXK5wMugSI5gamSQjLAI0hkbG\nCVSCHidMWBnODDmcWlthUyaHiEOk8pG6RRrGZIayVJpNxhwTvV0nUyqT5vuZ90KcAFSYcHxxGV0a\n2KZNwUuptJbZddWruX5slMWH78R2dPTUoK+/j/lGhTSMMAUUL7iQex54lvF8ltyU5NJXTiIy5Z4p\npCm2ZeHHCUopNK2ndpJu3E/OumAvNRQ4x1WTGyHoju9RGuonPhCQtnRavktoxsRAs+mSej5PPXg/\na2s7MR37HMGKO+78NjNbtuD7ITfdcguGZfO5L3yByy+/mh1bt/HPX/sK115zAV//+4+h63D1tVdy\n5Ohh1upNlNSIlYIowtI1nnrySW59/a34UUqjVuWjf/BbvPtnf4XnnjpGID2ENkIkGhT1MrbxrzZI\nvuSzvUzDSFKSk4t0DhxnfWGe+qmTpK0u6fI6mU6IUe9g1js01ytUqqvsOX8PhVwR2zZZWJ6n22py\n4d492IaO22qihT5TwyPnzJF6MYeePUi1UsMLYqS08cOeQmQY9bSINcui1W0yP7/AieNzdLsurhsQ\nxjFBGBKGIUEY4DabdFt1xOIK5zUjMqdXuLpvlCujLOPdiB2JzkTF5YJI5xX6EDuFzeryPEerc3hS\nMrB5hv7+AdyogxA9QrELJJrOthvfROmSm0E6lKWBpduUJiaYCwIGurApEAxFJjKUrHcTIqWRJClx\nklKWGiOWYHhijNmDz3PTpm2sr62yOrdALARaKY8bxoz3D+HWW+w/fpxyaZT5fWcoO33EcLaOLE4S\n6tUa0PPA0rTHs1f/yiVfKdW7s2y8/gLuQxO9gstQKrLFAlsnpinlHRLDQCofoRoUR0y0rCCwI3wS\nWs1zitj57d/6ILe85lauv+EGWu0W9z/wANs2b+Hqay7ny1/7IuuVeb7xlX9gqN/Btk0eevIplpoe\noTDwk4Q4jkiJCOIIUzf40O/+dy4+/zwa9QZSSv7mbz/KyACYWoLlgJZWCJrHefubL/6Re/blU3cx\nBc31NiXTwU57RWTh2jqmJWl266BJ1haWyBf72bl3D8uLy/QPDDJ3aoHJiRniMGXpzArK1Ak8D6/T\nIlk5NwT51FNPsbi8jOuG+GGCbjskKqZYzDE8MInfDmj4EQJBmIheB2DayxVk81lMx8QMFbqhI4Mu\n4+N9hI8e5qJEJ2q3SZKI7MAQ81oXlZWksYflhdhmyGTWIOfFxD58W0YsrCyjuSHl/n7W/RZZ12LT\nzl1Ujh9nYuJCzMGtnHz8QfpNjdZKhexQmTqwnNGYr1RYcV1Gdl+KF6wgg4gEjSiNIIkY1S1WbZvN\nK2s89z8+zohI6eg6zWaLh59+Dt3JEIcxieexeWqctVPHOJFxuKh1KYZVIBEmkCKkTtdtb5wgL/ki\nN8jQKk3PultncytpghTnPpGFUiSa4P5HH2I6W6Ytm/QXs4wWLS7bMUGz5bOwfQf7FxbYv9hC/UDX\n3uOPPk/Hd5mbP43XbjIwMMyb3/FWPvHJjzI7ewhT6DiayVI7IkwVqaaTJOB7XQxDAr3cj8BESslA\nocTPv/Od3H7H7Tz8yONce/01fPLvP8VtP/GLqNjGUQkq6nLTTZfCB3/4nn3ZjOWpR5/Ea3fR4xSl\naeiORRomSCGxC1mSIGR88wxkcrTbHbqdmEwmYnjzdoLAJTPch+MYBM0O/UMjZMam8JMYXsR80K1V\niVpdlJL0FQvouoHvu2Q08Bo1hNSIo4Qkigldj26rie3Y2KZOFHpowqSgNNI4IJ+36DTWGBodINMs\nUfFP0aq1obsO602SiWESXSLLDrbQMAoZ3G5AaKRMKZ+C0DnTruPmbIz8MIMTgxw5dIB+IVh85m6G\np5bIOQ5x2KKqAmhDt5Cl1e4w0t/PcEYnJyrUtQTXVWhKUkrAFhK3G9
JfGmdVVCmmPttEhqNxwqCm\n2J9o7Nl6IWdO7sdOUtxOk8H8KGGnRWd1lbDZwnLGQEmkiNB1fcMYQAjZg6SycYqoXj1YL1qmEKQI\nLUElEoHshZpViiBFpfBPd36NP/3V32S9u0ZBeLzq0m2kUQclEmYMxaAxxAUjIyy2Wjzykr3xxuuv\n4x+/9W3abZftO3eTy5r87m9/gHanjo6GkuCphCDS0DRJ4nqgUiyj148jhIVpmpi27OWmkpS8k+N9\n7/lF/uhP/j/+8q8+xcDICP//H3+I//qrf4QloVAusN750Rf8l80N233xFVx50y3sueZqLrniUnbN\nbGXrjs0U+/JYQnDJJRdjA2q9TrNaZXp6kqzjEDYa6ClErS5po03J1KHdwKusIGqr58zhNltYuiRN\nAzRLkCYxhpAEnQBCSRwloCSaZiA1EzSDVEj8RCFjejiKxCWIOnjE7LJGqT52gkaljhn4ELSxNZBJ\ngplKckFKth2Rtnxa7Q5CNzB0B1/XKSWSEdNifX0VAo/5o8fJ6DqOYWEvPUf77s9SjDqkusIVgsUE\nzK3noQaGWO0EtFaaLO4/wYCfoCKfGAWGoKkCPFOnVa8S6QaplOR0nZyE/MAgtkxpLM0xNTzAQCaL\nqKyTcQMG7DJf+9L30BKNVEs2tKcVtXp941K/0b+y8dAXirN5lRd4jkopVAJSGL1NKuVGj4uGEpI0\nn+W3P/s/eGZhH6+8eIR8GKDFBv2mzVA2y47hYa6ctInTc9uK/XqDd73jLUyP9nHw4LN84Qufo9Ht\nEqMTC50gAaSFrukkSQ+fbhj6RrObTi6XxbZMVJyQRiEqiRCpolmt88d/9BG8VpfnnjjOFZfs4j+8\n5zY0mefNt72Ru+6450fu2ZftZNHnVzByWTRHp9lsUDYzLM/OUXc7XHbhZayvrrG2ukbf0DATExME\nQUASx6AUgRtj6XlWaxUmRkbIlct4nQ7t1rm+73oMYRKCEiwtV9FUimHauG7QQ1GLBNJe3ZNKU4Sm\nE8QpaRzjph6Gk4U4YGa4xMjIMO3jbcJ8HwdOrOBYiiV8wKVhJZhxh8N2ipfGzCQaHaUwlIHdjYjt\nkFKxyPz8GaazGVaXKgwXc1jFLPm+Io2FZZxYRwkbjJiB4SJxqpOQEGgKwoBJKbFDhbe0gixnUWmC\nSEGbGOeRlSZ9eoQdx4xYOazIJ4fA73gMGwKnVYXAJJvJoesJvooIDZ2FVoKRLxKTAL0L/ejoGI3W\nLH4Qno0cv+ByvVDaIui5aYZu4mQMVKqjiIijnkB6ioQkJVYQonhydoET3ZjLRsuUkxghIOr6mHZK\nWFfkxblI7be9/+d5/dvfxCNP3E8UJCiVoKDXyaiZIDQiJHEYoOuCJIxJ0xjLsnA2NMiiOESmMVKB\nJiQqSUBITh0/RZomjG2+kL/8zKf5r7/yazz50B288Y1v4k/+/E9/9J79v7j//11DK9kkUuCGXRxN\n59j+QyjHIGs5LC0t4/o+ZjZLy+2yuLKK0CSaLqnWqrQ7HcanN2E6GU4320xoGdpuzPP7Tpwzhypn\nkZFDhgxGyoY8kInpxL1+mtBlfXWZKIyxnZ5sjjSNXlGgaSIdjen+MYaA0/uOsm3v1Tz87bu4RDPp\nHxnnittuxjuzRmXuDP5AH/OaoHbgGH63QWTk6TQalDIlsonOQrtKuZgjiWO6WkygEprtFtJrQ+Rh\nCBtBl5xyaHRcgjim2q6hpQljEjKxj5UqKlpCpDIYqUYgFCPbdvPI0qMMbd+Kd/AEUZKiRRHFnMHC\nyiq6obBSRZgmhKQkOiyGbcZGt5O48wRh1CuKRFKrrnHk8GGiVN8oZXkxXCw1yejoKELA4uLyxlGT\n9sLFSYTUFGojwCyUREsVEkmqUgI9z2986kt88j//PDNOigSkluJ1XZTu8/YLd/A7L7kj2UWHx++5\nF8sU+LGHNEzCVENJA032BAhV0u
OIpqqnG21ZGYrFPPVGA00KhEiR9BKtIlW9xLZSjE5OILUMC6cP\n0+4Uueeeh/nIn36IM0cP818+8Gv8wxe++UP37MtmLI8dO8J5m7djxxrVWo1dl1yCR0wsBLqVIQkS\nGtV1mu1l9l5/FdFag3ptiUJmlEwmD1GK3/HIl8ucPHyUhflFVM6Al5zoy+ESQ8YwMkhQCeiaIokD\nUhSpruH7CVHikyYJvpegaRqel2CZOk5GxzEkfaUS+5/dR9wR3PtPt1OuBAwMlTFLDm2/yfqhE6x5\n6wRhg+nNm/A7Pv2dEUJd4U9Mc/+z+6hYFlmRIOKYlgq4YGyK1cY6jRQy3YScEqgkQkMjY+jU3BZG\nvoQeaYi0QyI0qmGMrTk0pUZs2iQiJFGKJ++4i1uufgV3Pfow1+w5H+3EMom0MP4Xc+8dZdlVnfv+\n1tp5n1w5dk6SWt0tlHMCkSyRg4yzwcA1F4zt966vMzb4YjDYBgw2XGwTLtEmCWGQEJIIyp1bnbur\nunKdqjr5nJ33en+cakkl23q+5g95jlFjVO1TVeuMs9fcc805v/l9xa00OI6RKmoqwo8Sdm/exsFH\nf0Lv0Ci12hJbjBLNuQqZkXEm5iP2//Bx0kgQ4wMmnI8qKAxTZ6W6jK7raIZOHCuEloKeIpRCSh0Q\nSCEQJCihSEnQpIA0xjNd3v2hz7Bx0OIXXvczlNKUYaVIHZdGfQmeQfkmXAERVIOQRKQkqQTDRdJ9\nkKkkAZUghMTSTSxdQ5eC5ZUFspkMUeBhPjXNqTAMGz+JWL95I9lsFjfj8uThU5SXF/lc9VO86y1v\npen5OG2e0543Z7nsij2ce/QQI/kSeUNx5sDjeL5HfmgAw7Do1JqYhk5OT5C+h7+yRFZFuLkelssr\ndIIIkQpOnDhMLfKwcjkW22vZ2OcXGoicg+k4aIZCodHWYo6XJ/FJcNpgpAF6qiHiGCUikjjCjG2a\nQRtVNzm32MRODAYWq1xkWgwowRMTh9m8lKNgJkTHThPKkEoUoja0mD53Bt3MoTs6Qlhcvfki7mvU\nmTNs8CJKGZOFmTI1SzGTlaQtjzFpYEkNI46RKiRXLNFMwBYRqZYwFXTIGSbj+V7GR4rMB22CVBAm\nCVmh4S0ssjnXw6mZScbDDmzYxo4/+jBf+OUbWV8oMVbsgVgyW6nguAV67CJOKYe7ssLkE/vR8xto\ni16m5xtEadplmRSrWK8kWS0jd/UZn4kFOw/nT9PzEaZbPj7fjznf0ESlJEpSt/Lsr6Qc+9TdSK/G\nQMGht1jgZ669cY2zVCoNpF2krXR0zQbdRkgDIXW6xBMgVYzrGGQzNp1mg9D3cU0Diy72U8UhQulo\nmk4q4YIdF2E5NkEQcOjgITwvQNUC9k+dwX33OxjcfDmf+fx3n3PPPm/Osjx5msHRfsJmh0alSaZQ\nYN2WLVTbTVSaUBjoI4pDND0lrlVxChnay23Ozi4yv7hIJ4hZmFuiFXoEQtBJEpppHfJPrxEFMR0j\n5lxzESljhFJ4WsRKp0Jiacw3PLaNjdGn9xA1PQxHxw8DDKlRLORJwgChYl5w0U5m/uobZNY7FPtL\nnKu0mW+FFIOYQiaDyOWIlytkswVOhx6asgiCOiLKMFdtss42SQpbCDdIVmbOoLsGvcMDzNXKDKzf\nwMLMJDk3Rxp55KMEx7JJIwOts0Jk5vFMndG+EnF9GWUUyUmLVuSRailSKlrNKkngY5QEHSkRdZ9d\nmYhLRzbjhy3i2EP5Cd6Kh6PDzNRpNhcvxg9ivvTZz/HzL349YZpQqS8iZIqBQ0KymuB3I0t3pkU+\n1V8BiZSgklVHSdWqvHq6SifWLTFrWjfZl6mH0BUygTgGLd/HZJzSaGu03LX9sUwmQxhrSMtF6HaX\n6T5SoARKpmhaV/dF11Ma9QqSiIxrkPgJYbuJYUgcyyHwIzTLYv227RiOTXlhgdnZecLAIwFyIuWW
\nm27iwL7HuPa6K/naZ//uOffs81YNazYSHn/8cQ5NTHCqUePYygpPzE0zFfqcWWpSERadviLahq14\n+RIiXyKyu+U9xy6x0O7Q7LFYMTucCVdIcia5Ut+aNRSCoF5DUwFBu0WatJlfnGJpdprdWzazXK8x\n36xwYPEUM1GFpojpxAGapuEHIYVclqt3X4BtJJheRN1POasH3DK8kUVT8cChU8y1G4Reh7xhgh6S\nGeilrAIiKQg6HpplUEwSWkcPs1JbYGRgnNJgP4lKiaoNtFRgjJRQhsMVH/0Hyk4f9nyHRHjkB/uQ\nqeSGa17IbL1OEsXoaGSdHBk9i+NDIhR+zsTRJTKNqeuSnlaNb73hdfSW52k2l1FBgpGmxH6L0Otg\nFfPE2TzmwCDLcUysuqSHrUb7qc9NSYnQJEp2CSo0TUNKgVIJECPE081JXdeRsqtaJoToEpGILnXs\neTCmFAZpFJOIhFgmREGMFiviMMJP104oZq08QZRgCA0THaUMTCFxdIGja7iahp7E+K0WIklJg5jQ\n80nSEMPWyWYcOqGP3TfABXsuIQVOP3mcc1PTtIIOXhhgI9Gl5FWvfAWPPXGQk2fOYYr/opOSRCGX\nXnYFbqmE1wkxUvBVyMDWHdROTZAkEkuBV24Tpg2OnjxF6odUvTbLi2VqIqLlgMjbtIMa5XYZLQKe\nwapjKB3LskijiMD3aUcR2y6+kOYJnzOTJ7ju+l2cOzuJlCZe6tFsLpPRDDpejeFcDzt3beDw2TOo\neZ/BsWFqtkHp4l1s0CxuGCvw7TMTHKl0GI4Mcl6KMVdmPNdHZ2UWNwGZpNiFIst+m2wY4wvR5UhT\nEXZfAcc28UJFbJTwCqMs5Ddx1S/cyT2f/gThUpVNQxfwqD9LfW4G4eaptyS9uSwIHdP0yWkmyhRM\nLq1QMIoMmTrKn0eZOoZuktUEutDA0EhjhZ3NUBCCXLaP6olTbO1ErBvdSOLHREpHSoM0ASWfpoGT\nq4l+F1n8zE6lQohunnL+yNU9oqVPIZDPmwRUukp+oeiivFdfu+mGG+gbGIBntDjGB0ZwnDYr1Qap\n5iKFxLAsRNKGtNMlWiRG0ySB52Np3VzJcW00TaPpe/SOjLFu8w6mpqaYOneuSwKvUqRK6bctlITZ\n+UnWbxrhFa98OR/+q08QxM/NSPm8RZZN60eRQGepSlyt0pifJ6zViMplOp0Gse4TNFqsLJQ5c/w0\n1ZUG5eUq5cYSy0mTildncaXMxOwigRdS8+p0Ym/NGjJMadfb1KoNaq025WqdRw/sQ3MMXv7qO6g3\nyxiWTprGaLritluuxdRStl24gc3r+7BbLSzbIHtgnm0XbiMWClEqcs9DP0EdOcaIVOTWDbOUpqiR\nIr5pUdq6kWlbZz6OWI4DltIOYRjhxpCGMc5gDypNSBGYuoGMOwy1mtTLxxg9e4Qnv/wNhpMUW5M0\ntYQ+x6B97jRJo8PIxotoJ4K2SvDTiJYKkZok9RU1kbJxdBsq0RAKoiTCQYISBElEPWqSttok1SYZ\nJbACH2kqzPoKy+cmIFWs1KpoUqDrXUzYU3St6ukJwvPlY+gKzj4zeliW1VVse8asS5chRq1+3/1C\ndQ92hmly2eVX4Jj2mvv2tre8BV1FBKEHEkxToesRKq5D0kKXAWHkE4Y+SiRIHfL5LIqEVsujNDSK\n7mTZu28vp06e7HIKkCLjhF7bZePoKLVmlTf+3M/ypa99jU1bNzE1MfX/N1X8/EWWRn0FTdpUai2I\nE/p6SywvLzM/O4sczJOWbKbPzHP46GHKS0tdeHwQMFevstCsIG2Lnr5+qtUmXqvefRrG0RpAnpGY\niCgiSgVC6iSBYqlapqRnuOfuuyiXO/T39OIlbbZfuJGl+Um2DvWyWJnj4fkVRn1FXMhz1eICsx2f\nuYUptMQnKle4IVNkKIwxdwxzeGqReKbMRNFkZ884auAUuVaA
E/g0ChZKNxhMHRb0CGughLNSpVpd\nwTBNOlpCfwi2DFl43++iy5DQb5OxTbSmh0h1ehBY+RwLZ05gD/ajFx1ioaCQobFc4YKBEc5Vlzm5\nPEV/bw+p76OloEUK3VSkUiKlJBsr9DDEbzZxNZsyDYbaHvvf/0Fu/cQ/0qzXkbpBnHaZXJRQpMna\nSCFWsV9qdcOfjypJ0iWt61K8rp2HkfKZE5VPY8k8z6NvYJCJo6fgGcQ8X/vnr7JlyyZ6B3o5eOA4\ndn8vkVfBkilSd/E8jyhOSYkxTZNSMU+r3ULFCabrUKt3qDXKJIFHtAoKBUVGE2QLLvsmnqQRpLz2\nNW/kxz/+Ed/+1g+wdIeE5x7+et4iS9i/gZmVOmPbN7LtpmtJexw2vfhKRq+9GHs8R2IE2Otttt94\nMaUdg/zG+/8Hw1dt4JZX3sSwW0JrKxYW5lkul6kvVAnbMUm8Nv7PR3VO1edoRnWEimi124yMjiJS\ni8ceOk7cgqXFKko3OHLmBEZ/LwfPniURklaYwMh6OuUWvct1MsemuLEu2dSO6NdN9JaHLRPq9x9g\n3a2X0lcw2SwEs3ffxXB5ieV2jam2j7cSoefzNM2UnK7TjFKCnhLS1LGNLMU6+IYijVKCdodCs82C\nVJTSDLopIQjJGwaqVmHQdClmC3RaAb4XkqYagWNxqjbHFVqRvukFwqyNGaUEIiYkZEqmXcFTkRKr\nhJZhcDby2Bc1qC54HE3r1JZnOHXwEMVMpqsTuYqtkkg0IZEIdK3Ld4xST40hC7oOEMfxqmMkWHaX\n7lWpBCGeURGjm/Rrmo4QGkppBFHM/MIy6lm5wu23v4zrLr8C6fmMFB20Tg3bMIjQqNRrhElMnCY4\nUpC1DKq1Bk0/ItAcqo2Q5aVlOu0WQXz+fabkBVy4bQszK8tEiUatsYKbtbnjjrutv1IAACAASURB\nVJfxmc/+M5Ho4PNfVFNShnWmZ87Ql9V44Ft3EfRnSSZMau0OK+UV+gYHSKVg7tgimqXx3fu+yfBY\ngbNnZ9iwfQjvyTO0Kw0ybpGVZhtNExCtfTI0y3O0OjGBHuKYGrV2Ha/h8/f/+Gle87rX4fl1itkC\no2ODTE4c5+oX3cR8vcGBM8fYfuF2HrrvJ1yu97JpZJyF0xN4vQWMUj/NdRtIF5cYLmaxGlUm952g\nIEy8hSWKyqSCxqJQVHtzHGxWKMzUySloC8m5+UV2b9nJYrNF6GRZIkRGijEgNSRJqtGQgmYaEaYh\nKyLEDtr4GZNCf4l2vUraqqK7Du24O3NiJDoTfpXLxjajRoaIK1WcwOK0CtE1i/l6m1Kc0NM/hGvZ\nbNl+ISfmzrCh3yXMBxTy6xjbsJF7vjaDlBJDF8SRQj6Vo8g1MPyn4fqrDUupoT+D0EJKSJ86ua2d\ntjwPwjRNiedF+H6KZWa602yrds8932FxcZGbbr6Br/3zN1lYWsTNurT97ghGHMdYloFu6ni+TyeI\niZTED7pc0XHcpYtSohsNpFKMjo1y7MwZEl1ybmKGu77wj9z2ytt54Id7UfYo0hpExR4w9+/v2f/c\nVv/prVOvk82VeGzfQRbbFU7NTfH1r3+Hx3+yHyUcvvylbzI3tcyxk+c4cuwUn//SP9HsRIxv2ELP\n5kFue82LiTWNZr2GmbVIVEDGXUuy97LX3khMQL3ZZKVWZ3hsgOGxfk6cPcaFF2/nL/7yj/CDGknq\nMzg6yh996M+4b//DGMUs04vz9KwbZSjQSGptjFKOpJChNb/EfKNCrVKnvVCh30sJZpaRm8c5o4cc\n6U15wvI5otqExQybL9xJqb+fbUOjXL5uI3vGRzFDn2L/KA0/oGfrFvIj4ygEMXE3x9B1VsyUnq0b\n6Vm3nnxvH8PrNuAMDaLCNoVmCyNO8aQkRSMgZS4NkSs1wqNnqNgW4S03Y+zejR1KjKxDT/8IPoJq\nrcrUqeN0FsucWphi8thp
1m/aBoZOmCQM9PSRtVx0rYsPOx8ZhFCrFTH5dM6yihYWoiuTcb4C1s1d\nnj56ydVjoHyqqtYtK5umBUrj2c9sy3S54fpbmDgzzVVXXkcuV6TdCpFCR9c1DMNAKYEfJbS9kFh1\nSQDPjxQ89f7SbrvItR0Wq1UaYcThJ5/k2LFTjA8O8vhjj3Lri16EF9fRTQPhr815n23Pm7OcOnKK\nkycmmV+sMVWts1DrkLP7mDm1yP3ff4ixsc08/tghmp02hulCmuN733mMQ4eP85mv/RP7Tp+hmiga\n9TqOazE00sdv/ua71qzxM3fewk0vuxJEhCkDZsuzSEvw4EP3kxJy6vQRNm9Zz/LyIo1mleWlKkGz\nxcTjhzg3OcHs0gLZWCLSlNl2QG79KMFymb7eAhN6ypFohdOuzzFH8UjcIcwXGOkZ4UV7ruEFG7Zx\n8aYLGOjtw2u1kU2PeH6JEdtiaWWO9Ru30NfTy7nJaYI4RSqJpsASBrpmEkmN+3/yY+aXlvDaPnMz\ns0zMTFPKZdjQ00PqB7TjGNvMEGsaqphlryyTyhg7kOy6/uXsP3MS0+uQBm0ay2WiRp1sGJELA4Zc\ngzBuMWiYLBw+hheHNNtNoiDA0W1yuSz5Qu6pxPy8pWnapUpavfTMmfzzPxuG8VSC/2w7f80wDHRd\nwzJtHGctNmxxocmpk9PcfPNLmJ6e4o7bX4Fpmqt/L58qGjTbHlEKUdLt8yCezoskYGg6jmlhaAZS\nN/n+Aw+imw77Dx/GwKFU6CdNPd72c7fyF3/2Vr71pb/iuew5nUUIYQshHhVCHBBCHBVC/K/V6z1C\niHuFECeFEPcIIYrP+Jv/KYQ4JYQ4LoS47d/732GnTm/OIpd1GSz24i+uUKvXiE1JpVal2ajT21Mk\nVh0yWZ18r4sXNnn4h48ykBlBBIKk08IxLXRd8UtvvgNrZO08ywOP3ssvvvnnGNsyTHYox52/9hIy\njsXvv/13uPrqPWiOZM+uy7l0z05UDAvTZXRpoDsmBbfES6+/Dncow+FalZWkw8LKMtN+i5NFjX02\naNk+uHQHN227mJ3jm9l9w3VoIqbR7LBSq7J88iSV8jJzzTrznTae30IszjEuBNH0FFarwXrTpB22\nMA0bqWxqliDnw3LqkzXyiFix0m6TC0Li6VPozRUs18Z2DLw4xin2YUqLtOMz2U5oj+SIRcS3PvDb\nZCIPu2gxIE0MqWFo3bwhSGKEzFPXYEuhn2BxmqxwcLQUKW3QEkQUoicBgyWXQlYnSbszLVLTkCLF\nkKDRzWvS1alSqWurnMkRhtlN4p+Z1D9zeCyOUzRNZ7A/x6mza9UP7njlS2l7de69/7vcdOstRFHI\ndddeiyQliD1CFeIlwaqIhIYmJLrsEgWej3yGrmFqilJfLz6CK6++lbu+dhe6itncm6fiL3H68BE+\n+K7fJjtf5tg//xPf+8Tf/uedRSnlAzcrpfYAu4CbhRDX8VNK5AE0lms0luv4Kw1EJ2KsZwhHWGTs\nPELZnDk9RaFQor9nkKNHTjF9dp6ps/Po0qXdTNm/7yjZQg+hCbVmi6Mnz5HN5desYRo9/MMXPs+m\nPRvo3Vjk0ceOsbhSY++hfWzevImlhRrHTx1j9+6LedUdt1PM91BdarJ75x7KExNMTpzGSTX6du3g\nlJnQrHUY1HO84MKdXHrjNYxv2MBIoZd6q8qRfY8h44jq/BILE5PE9TrxShWz7ZNx84SahjAMZApW\no4E5O8OwoWPrJrkgoZkmqD1XMTPWh6trbE51RBog44hpKkRegxcoh8rCCnNLZUgjHA2WF2bQBaRC\nYDkWc9U6QQB2OyDw6tS9FpECpWsopQikotKo0w5CCm4P5baP8H00r44pJXGiKPTkGRofQ5oWqRAY\nlo0U4Ng2ru2QdTMUcjk0BIaQXWQvktAPIOnmJJZlr0YPfQ1Mpjsn0xVF6nQ6hEFAPr/2vh
09epTr\nr7+eKEw5eOA4e3ZfSRxqvP2t78TRXSIvQaI/daRb3aurxOQGjuOybuN6Pv7JjxGrhI99/ONcdtUl\nWOYAH//QZ2nVJ1lX7Gf50QP0N2PcSgen0kY01hLL/185y+qbON+pMekq21TpSuR9ZvX6Z4BXrn7/\nlESeUmoSOC+R968sTFK8MCA2dWbCDpPLS5wrl1mq1DEMHdPQufnG63j9q2/HdTSkVPTku/otSRJ1\nP2gvxLB0ejJ5sjJLwRpeu4gPv/fWd9M3UODgk4dp1D1kbNKJYkhsRgeHME1JpxJw+cWX8b4/+mO+\n8vkvcuFFW3nhNddyycUXk0xW0HSbvhfs4WC9zEyjSnlqmnyuQKvRoHl2nsrZcyRnp3GaIXYiybgm\nlmngahIZROScLJgGCRIvSlGWg6EpvDimEoZkE0kto9P3q2/lllf9AipVuCQUDBNPg0A3qQtFhYTY\nsJFSRzY7pI0qMgrQUCgpSYOQUwsz1B0NbNAsiZKKjoRWGGGaJppr4WRdNu28gP6RcfKbNrHjwh1U\nZycIooBcoURjeZlCtgToRLHC90MKbhaZdimpHMtCk7JbVk4VhqZjGToiVehSe6pS1oXzP11mfnoe\nRqDrBq6boVKtMD+/Nqk++uST7Nu7l6uuvpIg7HD8xBF6+/LsfeIx3vHf/jsZO4NK1WplrYtPy2az\nlHp6GB8fZ9v2rdz5hju59vqbec97P8i3v/0gU1M1NmwcJfRr9OT7eeBzd2HWW0QrS8wuLSNViuOs\n7fc82/4j+iwS2AdsBj6hlHpSCPFTSeQB1JptIiFoRm2mwgbCsbFsi7jSAVfjVXfegSyEvPDyF/Oy\n227j45/6Bx78yUPoThaRwNj4CPsOHaDHKaCbOl/40ud4xRtesmYNL6jRpMET+x/hW1/+Ht/8wSc5\nfWKRA8cfpJAv4JouF2/ZyYc/9jGuv/kGDhzai6VJin0Fpo8eZ0fjEtZVO8xXz5C9dDtbrriK5Uf2\n0Tl0inXX9tNYqNDK2qw3cghXks92hWBzQhBmi6h2C0ybvmIvC1NnkEGClgY4vSWkkWd8+ya+v28v\nIozolRYX1CM++bFPkeoaoSawlY4RBQxi05PL0ghCcn5A2m6ipwmDmqKRprQiH1s3sc0C7XaNVrOB\ndCW+F4JhIB2JtGx0TUczDarlCieOn6Q+O828SOjPFQmShCSImQ9bFHSbickJwtVk2XHzuLqg2Wmj\n613hIil0xGp+4qSKgYEBavUG1WYdLwmQukmukKfRaAExrE5YplGIlGBZOpZrQhpQW16ATU/ft56c\nzcL8NE88ErJjxybiJOHxQyd50xt+Fr/Z4tI9e3jyxJM0PB9DN3BdF8dxGB0dJY4TUBonTk5w+uwM\nJhqbNl7AwIZRnvjRd3jNa25hONfPZ3/yQXRbMbtSQaUaOy/aSJ+7NsL9XzuL6rZv9wghCsD3hBA3\nP+t1JdbiIP7Vv/i3Lt7z8GHQJH4aY+Z13IECptBRtsYb3/ImzqycoBKYfOCj7+PWF97C6994O6Hu\nc/LULKePHmd04zp2X7GL3kKeRmeFa3Zs4WN//+dr1qg2A776jX9hbHwTj5z5Efd+/2GaVZN9B/fx\ntrf9Fvd881+49NpLeM8f/y6dJKRSWWRlZZZms8OfvPfPCcseyb5vkSnPMjEzQ+wJwmaHogThGiQJ\nKFNStGwqmmRhsYyjm4gwpiMlabtBVegMDK8jzDkUDI2+jIE50s/RekCaQJuEKdVhXSy49z3vZDxt\nMmcqhBeR1VN6pcRNFG61RSIFCoFITDQhyAgBlg6WhYwhlgK9MIDV9jnS6TBg57GEJBtrxJGPaMak\ngWRAagxpFiuWzSZDo08YTE6dYczUOetk0BKTOGyQtXKrDceISCXEiVot++okaYxmGgRhyHDfAPVa\nDTeTJUkTtFAgooCsCtg01k+PZZExbWzTIgw9NJVAEg
Te/Ae/x9Wf/BQbUcSsC9h9B0h9jb6NIkccAgJHEDlO2PoIgaOIoR+BPWc3c298Gy/76etZ\nNjl9MtaP74cYSCJGWvPKV17JE7d8HaFrsqFwhsTk4EqEF0TesOfsc8gHGQGHkaBF3XuqhVVPhsq5\n4PHBccJfuNrUu1hnKZ1nHcEffe6z9OIYLwVaS1qxRouwKfSq+7kuK/H5BiHPmEi2csHuK7jqwrcz\nq/ewfGSN1Y15XNnl+RddSCMWBFPhUNz+2H76xiKCpNNIN30CSkIxwOVDitEGy8eO0JJNnnv+xVz3\nxp9mz3nPxjgDKjAyGT6ETefRU0fdJK15YpGqXWa820RApcIGwcYwIzeO0nuq4DG+RGyyyVMCTQQd\npRnTCY1I16aBhWH+yBJ337OPW77/CHfdfYj7Hz3C/kcXWDk+/IH38dTxjC2WUVaR5RWVCSgV02yO\noaOUtNE6aYOktaYsS8qyxBhDo9kkiqIalvQeFzxpu0PcbFH5wHr/1LTie9eOc82b38zNH/00ZRzY\n0l2nTAKt/gjl1mtBkRI4bxEEvvbt7+J1G58myMqSWF+7qQRLyziKSpNF44x0iyqAcqJGaqwndZAf\n75LML3PrJz7NlI9oec1sUTI0gs7Q0sRyz//8FDMm41jUoZrcihIBEWxtSqHBh5yVQ48x0xBkCEpR\ns7Mj96QllJTypCrxRP8l+DofxftAEDU861RgNLaFz912D5VMaMUxY+0GiVY1tSeKCdYhfSBkGWZj\nkTAo6DTOYLpxHi++4K2cv/ulTE1Ns+XMGaa2TPPc8/fgpccHze0Li+RCYIoKZyyRsEQ6MKYh8gU+\n62HzLv2l46TENNQkV73ian786mvJ8gHNhkYp+QNomLX2ZFyFEE8miT3VPxnAx5qeN/SKgl5VUApJ\ngccEgXMeHwyaQCoFjVjRbiakUUyiEprNaRp6HFMK1o/nHDiwwAP3/wtlHQ+yjMIYyrJibXWd1ZU1\njh05xuLiIqurK2RZxmAwwFqLlHXI0YkPKknqAFBrLcNRxuLxJVbX1rnpezef8hpTk2N02571McNW\n49BJyi2/9Bv0mkMaZUJRloy8pVTgBwa++m2+9OOvZ7Z0mEhgde2UKULdrZ99x/Vc+v4PMPfjP4nU\nGoNHerBCUknB2HCJhz76O4R9DyJCRTeCJRU448pXU42PUySGxG9A1ORFv/Dr7Ln6bRgiFKDwaBSR\nVsgyI7IGoTVRq0EzSeujl3mSKCpOIx/WLpMC8DhZE02TPOB1yh2PP8FACiItULEkjiVaKjAOV1mC\n90TCIfIeJlskii3ptpjMR7z48qtpt3Yi5lrMnL2DsVaTZjNCOE0/TTiwuoSpHIiAdRWYHBVKUukQ\nVU416LJ+/BBHDu0njjSTzS2cNXM+/+l9/5Wmb6KCPKnhPzGiqK49T7AATngtKK1RSp2UFhTeg1L4\nJKYQkpXMsJo5CpEwdIFcWvJQUZqiZknYCrxHi5iJWDDdiphIYbKpaDRaNJrJ087ZZ2yxtBoxPhgK\nU5CbimGWMxrlrK13OXZ8hX0HD3N0YY1jCyscnT/ORq/PIBthTEG316dX5gyMI8iUm26+lW1zc2yb\nmzvlNZpRinCS51z9GlYbtbl4x0XIODBqdcifvZOXfPoGBueeh8Kg8py2kPSzghJPZDSonEomNJqK\n2bExis44US4gSIK2KJsRbAUikKqUyEe0TaDbsMQDuPwXf4UzX3U1PdGgbSOGiaTvA0v33csjn/0b\nTGFA15JYW5Z4EeFErX/JZMWu7dvYMjeLFA69uYtY506aoMvNRSSokUSCQG7uQl4FUp8ROglHRw4Z\nS2JhEA6ch+Bz2rHF25zK2ZoFPhrSf2I/7dBgbHqK3iDmiovfxkRzD1MzZ+KEY7bVxskMCsF9h46y\n0svITUXpRiAlAgdYWg1NLA2pN7RsSffwE/SWl5jasoWltSFveN1rkFj4
ITtLCAEfajCCEHDW186j\nwSCdpcKQikBDpRAihJYoqUEKVgYjNgpLN6tqlx3rKStHZWvBm3YVo7wiK0q8raXSiYDkNF3N6eOZ\ni8nbZJMaZxHUd4y8yImiaHM30YDFR3Fd7KsGc9tmsTYjKIfxgiNHj/Dww4+SxJrtZ+ygcvYUX+de\nUTGsMo70e7jLnsXwngNsyQzLvsFkWbC8NKQFnLlzJ9kjj9Wnv+Cw3hPJiJUIQNAyOYUQPPixjzD6\n/N+SbAwRzhEJgWOCntdMaY0qB/hYYKQg9hFtnXDvJz7D9HPPpxR9DJ5WofGyYPStbzBGxvGxcV7y\nghdz903fpNw6g11dIUokpTMYAlVwTE5PsbCwSFVaXGXQSU3HD84hlDp5XBFSbnqr1apT6wJxFNG3\nA/Y+cZiLZ5+LDHVumncG6xyRhERAWTkSJcjzDO8Ua3sfIp7bgYpS3ChiariTu+/7Jt3+ccoyILRE\n2pKQChaHC3gxQVNJfAJRolGxxJqcNAhi5QheUvZyqrwLjQbn7jqLQ0e6tfnED2kGel/Xkh4I0hLF\nGkGgIRSxBuENthTEURPhDMFHoDwhKKQUVEFgfEQ+smgCcazQ1D0cH0TdqBTgawp7rWOX/0Kh4xP0\nDesszlsU9VGrLHKiSJ4kSSqVcuaZZwCBlZVlwFJZx3duupUtc1v51d/4Db7zrW/xt5/6O6644kdO\neY1/+MY3GHU3WCiGXP/atzIcOFr37Wf27T/D/Cf+nLlRxRff+S7apSUREi88Igg00J/dwqv//W/y\n0d98L53eEGE1TTegtdjDExhqhTTQP/ssrnjfr5IffJS9H/4QbeuoooQ4VLjIM768AIvzjOuc4bYd\njNJJovkHaJY9Mik589XXMpzdTn77bTTdBEM1xApDKT1GBKwWDDcBjjhK8b6WKcgAUtfGHScdO0PA\nB49SJ1IForqTLjQH5hfIzfkIX9se1YvJoWNJFBwj62joBK0UlQmYXp+h3cfEtm1Ixtgzcy5N1+PL\nB7+CrSyuNGjVIB7vsJYUNKxBk2BsiQ+GyNUqR6kUwgaUqJvIwRj6T+xn+6WX0Wy1GZ+aZnW9+wNz\nI4RarSqlhLJgttNmohWYnknZsqPDlmdvpTmeAi2+89X97HtwhcpnKCUwjprRHASogPGWyjmEMWgE\nWmriON68KYIWgPenb3A/MP6pMXlPAH1qmaIJIbzgnxuVVx8dAjPTUzzrWc9h8dgC2Shn97k7Wd9Y\nYqzVRmtFu9VBCMFg0EdHDb79jzcSJxFbZqbwzrPe7fFjP/Y6Dh86yJatpzo7/8TbfoJdZ2zn8cWj\nVD3L0Z2zqL2Hef4bXscjN3+d+OgTbO0GfCwok4C2aU3/14FQRgx0ysz0bsz6w3irWYstjSoAMbEt\n8UIR7zqDvNmgYSKElIwijwoO4RVGO2IJQ18SGUEYeK56/bXc8/H9ZH5AKpsMvnM7x7I1AsusLK2Q\nNBrsOncPS/sfxgsojMEJmNkyw6g7IAiBqyp0EuOecnXDpkyhfkhCAC9rtraWmqP9Xm3tamsSY6wb\nyE3zj4YMFHGCKXKazTbBUMuYM8docZ5Wsk7nrPM4+MA6V132evrqGMNkkdbkLI8+egfLSz2aPkE7\njfKBoqwI1pKkCYmOmIwbNIBIKazJsdUyy8fm6VcVMknxp1GVgvcIKRFS4b3l7KlpJgUkJRSLffqN\ngFuRzIztIo1y3vjOy1hZ6PIXH/weprK1CZ+QaGzt8SAlLkiC0hgfsDaQuwwRINUKKaChxKa44Z+5\nWKjrxytDCE/Vf/6zovLO2XUmExMTeB9YOPYErVaTJ11nxwAAIABJREFUs3aeg7UlOtqCcoJgA1WV\n0+v1EFHCP3z5a0xNT/Hyq67gczf8Pb/1H36Hr/zj17n0eW/me7ccJW6fyl7tNFrcctttRAraiWb6\nnDmESHjksx/HVbWJQalrnpAXlqFM
mU4UovTE66t8+dfex7A7JJEWfeYO4vkFouAplMOpiMQpunfe\nzC37HiIZjBi3Bh00MnJ0t55D4/AByibEokGmAknR5c6/+iCJUgQf1SrFwTwaw0g2SZKAcJ6ji0cp\nZMB68M7htEA3IkS/3vkkgeDs5rHrSSOP2gmSkxLhEAJGa3RVQ60PHjvOCydblM2EhjUMQ4wJlvE4\nIbaBSkmkd4xFiqzZwIQxRsvzTHaGHLy/ZCr2FAcPMTO1hTPm2hxfXeAVL38NpfB8+2++hJYxbTOG\nlikhgdGwoF8VjOINto9NMSkhENdF99o8mWhzxo7zWFk+FcXUclO6LKATKcalIBYVQSpSnWAHgdGa\nYGV0jHZLstc9wvRUwlvfeRmf+Z93IrBYoxAqoFxN+VFC1tJkPEGJmiArBAPvEEHQqwzqNKXt6eP/\nTYF/+oHuDdQReWx+vXbz+5NReSGEJ4ATUXmnjJ07d21ChI7t26c5Y/ss3e4SztbFqsVThsDy8SHL\ny0OGA8G117yNV77uDQyyiMktu/nyV77D1m2zfP0b32TX2btYWzp+ymvs3XcvIYwIvuT42nGiKCZr\nB/KvfZ2tx56orZMk4KGftHn7Z26g+ePXYuIG2hWcuTZkxhhkGtHrrtT1gFAkool2kEclM8MhZx+d\nZ1vWo2EdHku+9RzO/fUPcNl/+jCRiBDCEKuALj3NAJW3uNlpMq2wwWKCx3qBCp5YKdaHA6wQCF8T\n/wgBFUckaeNkWJAz9immdqc638CTGfTG1l1pqRTfvPdOBiKgXcCoWtIbfAAsSaRrp34ZaKaqVnk+\new8+jsiP92iZZS5+5Yu4/9g+Fg8+RnbfClMHtjH/wCL3PHAPr3vzNQxlycYwo2sqcgsrZsSBcoXV\n4DjUXSevciLhUc4To9g+O83rfvSVdbLYU4cSSOFoBs+2ThsRHEbKGp2UoEWDdrKVIw8uc8/NjzLq\nDskHfVw8z6/87jvYds44Kgikc4gTxucCtKpDmrQUxFqiZCASIGVARgojn/4c9k9dLIF6h7hLCPGu\nzb97uqi8Y0/53R8alXf+BRfwt5/8JFk2osoL+hsbTI6N48qKfm9EXjluuv37HDp0kCRucPFFF3Hj\njd9l8cgRytGAs3fNMjcniVyJNkMOPfIA/dVTF0tZDKnyAmxgIp2idBVzV1xAq1fVBR111xutSF1g\nQwYuv/LH6Jclg0jghGGt4bGVJi3r862NE9aFYD1q0og6lLGgTD2FDAy0oogcUhtmRcWhh29hFIo6\nWwVHJRwDRigds72xjaoyjLylCAGLxRiLFnXMhNX1BHdFVVPzlaA1NbYpA9ZILakqc9Lp5uQF3YRZ\nT/DH6h5GQIbA4UGGn+pQbqzTbCqUTvBOEKeCiVYDIVMsMOgdoxXWESuHCPSpJgvyxLIyKnmkyvny\nEw9xw76b+NJj3+PAvoOsP7TIA3ffwfjWSVwaUyHoZZYjgwFrbcP9rPIIQ7679yAb/YzIebJRH60E\nx48cZktr4pTr5nSAYDZTlB2NRkQjrg0qjPdYWzDekbzq6st52SsvZvdzzmXb7j1Mb3s2E2du4f0f\n/I80GjmKTWQuOLw1CGdJtKQRKSIBka6FYomSNHVCov+/gY5/JIRwCfBa4D1CiJc+9cnNDJan4838\nwHPv+5VfI007/N3//QUOPHGUlZU19j92gNwG1vp9Dh46zGg05GWvuJKrr309Dz5yL89+7i6+8YUv\nMDfdYWpM4aoew/V1dPBs2zJGdJoXkkTQTBtEcYw1JWdOTDB78YUg2xgiqqA4kErGRYOGN/zd9dfx\nyX/706Q+w3hOevY3jCbO6y6xQ/DSX/k/uObDf8JgaiutkGKFAhchraRRSZoL89z2yz9L94s3kDhL\npCTD9jjn/uzPo2JJLgTsfQKV1kZ5xpYEZXFaI0JgQK21iGWond+tR1PTPNCyvltaTxxpwBNE/Thp\n
o3aiaXdysbDJr9L8wV/cQJ/AysIiiR+iQkHhAlIUhJBx9NhBijJn8cijrO5/gFBuIGNBU2iqhRV2\nbT8POzbJfNJhrxuxsl5yyc7nIvKY3kZGZkoKA3npcMJRaEtpRxhTkeWO2+9+iN5gjaga0D22wM65\ns7jo0gtOvW4+0NIRqfJYU2GsA2fpNBPSsYTQCjz+2N08/sSdZHaN9f6A+eNdyqqNjrZwfOMI//rn\nfoGgA0kkibUgitQmC8DhrEWFmvZf5hXr3T7r3Vpv83Tjn1SzhBAWN7+uCCE+R32s+mdF5f2b699C\nXpYsHl8k7TRptcfY++h+7rrjDgb/D3NvHmX5edZ3ft7tt9ylbq3dre6WulsrsmzLWyxhYxxscAwm\nxBiDk3M4h7CETEzCBDJMksMAw4ThzCRxgAMGQzw4hASwWWxjbAOyLW/YkmzZliVr39V7dS13+23v\nNn+8t0rdshHMkHPE+09X36q6dW/V+7zL83yfz3dW8dIbX8Kx9YNs7W5z3wP3MFjSHB4e4dj3vYXz\n555Ca8XuOPVCCCkIwTIcDi79IVGQZQbvO0QueGI2Jm86OtOyVAdmJw7w9//j27jzB36M6fQsG75E\nRo2XGYUPTAgMraBWligVQXr6Nk3kc7kmrA4xp86gVUZUDUZUdAwJ1lLGLjl24al8yfPe+L3om9/I\n+Hd/h3W7y05fojpFPLpOdeYphlFy3Te8kodu+wyVEogQsTjWeppxN2X14FEIJStHDvDwl+4jRyaL\nCSCGZBEYQwRxcUPc01YQRIdRmi1K3n7Lp/i7L7ieA0sK7Vs2t04StcHrjMwEpl1HJLI1Ps+5s6e4\n/CU30PiGEC1XHTnC9ctPce2xF/Ar7/sEy4evJWrB+tIy60sDwgHQaLY3t9g6m7OzPWFQ9hBe0imo\nYuCpzS0OI8miph4vc8XllzKGNYZcpTS4jolEaXqC/kihRjltHrEOxhcmdHNLORwgpOJc8RhXnThB\nf7TFd/7Dt/I7v//fcZOOtvPIqBZHTpV+Rz75kS4vLaGMQucShWZr51JI+f+nYBFC9AAVY5wKIfrA\n64CfJVnifR//P63yJlVN6yxHjh9nuLTEbbfdzmu+9e/xu//lXVx11fM4fvkGRw5dxsaxK3DacO/n\nP8cLbriGJx9/kLPnnmI6aVgerZEVgq6r0DqjbS/dWbKswHvIdMGsmuM6Szi9RacFq1rhz2xTTxv6\no2XkNMMicApCsFgvOdvPCC145cAICmFAwu2//A7ycoW8OksrHYXXWCmYRyhlAisgRZKgoBiECfd+\n/I85gqNwLYUX7Fx1BfGs5eqXXsn5D58jC2nib2uPj6AQZFnGddddzix4Aop63iBiJO8XdNMGnEca\nhfMBpdUlJqh7AEJg/+MQIlV01CHn3Xfdx1BZvv1Vr6CwMG9afLPDNSeOM3M1VRU4eW4T3VgGIpJn\nmsZblvuSW794P194YgtMn8NLBdFJvAgoadMxUDasHVzm6w++FJdHOtcwOT/l3JNnmdcVp2Y7eAkH\ntWL2uGL5mqu52C3EIcjKHoQ67Y7ConKF6QvykUwk21ZyWK3ROIubN0Qtme+eZbL1ZQ6sDtk4MCK4\ngIgLpXUIZKZA6eSPoWJO6yxV3ZApRSYXjm7PMv46O8tB4L2Lc7EG/nuM8c+FEJ/nb2CVd/99D9BJ\nOH/hArPphH/zb/41y+trvOZbX8vx41eydX6LzfEuG+YYs8k5umaXj3z4g0znU2KEldUB8/kuB4aH\ngVRbeKYEZDqZpYyQc0gfaYNl0Ea+6V/+Kz77ex9g46FHuf1HfoSN2uMxBBJDNwZBfPlNvOEHv5ff\n+4mfxbtTeNnDe0mnPUu2oqznNLGhyjQqtBTf8Dqe94Z/wN3v+BXCow+lFyAgykgtJYNHnmD79Nsp\nvGRcekIrecWP/y98+pf/Lash4oRl8vBj7MYWqw1ZBCEFTbeFjI
ZlPaJpA5PYcv0LrufOz92JQSOi\nIESSA4FWl1hwxJjQRt77BABREeU6vBLUUqNkzh9+8vMcWVtjd7bFtz7/emTXEWyHlZrpZMZ1R1bp\nSU2n+ui2pV9OefGrr+LL901xjUYGh/QCpEGIgJB7vGqLio7SQRCSlctWuOzgCCGgax3nzoz5yrmz\njJqAWF2Hi64tb/rut/DwF25jdvZxlAmILJL1Fbp06EFA6ojWktAJ+sNBauCK4GuLm57iyIFX4F3B\nYClnUjVIFVFaEYIlukToFyiyPGNeTSn6JVoE/ioB/l8ZLDHGx4AXfY3Ht4Fv/ku+5+eBn3+255Wm\n4I5P3c7qxjqvetUruPeee7jry1/iyitPcGjtELatWT+4zh2fupUnHn+crmmJQN/0QEDddqjcMJuf\np+wt0XWeumou+RlNl8R4TRdom5a6nTCbNHz2vX/K1/3Q9/LgT/80y00CIETVEoPGuAqtCy6Md8iU\noc5AdBb1gmuQ65fBR26hzTQEnwyFosfqHlff/ErOD0csn7iKM08+Qa+NTDNA1gxbRZcLZBOI0mF9\nn+GFTSZ/8h5WJ4HzwTIUhs0L52h1JI+pSWmpn7O0dAAVI67zeG25auUQJ+cTjh8/wsknz+C8QWHQ\nsqWzPtEaYyBquZC9pIUEQgJsyAWlXghsCHQCHtnZxpnAXacu0D+aUVVj7txtOL27ywtvsIT6KKYH\ncRrJVnZQasyVRzTLQjLoGZSyBBfpnESrVHXPqfFEKJexTYW1EYencZbRYI3Lj5UcObRM5w2b5x+6\nJFgQnmNXHuOeMydpcfRKhcgt+bCHlwJvLZky+Kwjyg5hIkYojDP4eJbV0fOS/0wE1fMc6OccPbbO\n8lqPldUR11zzdXzwT+7l9k/fxaEjfdx0hDO7yL8iHJ6zCv6dX7yL2bzGlHPe8wfv403f9UbmdcvK\n6hqf/vRnePnLX84HPvBhXDXDZBlFWaRjjVRkWYa3Ch8sSMMTT51GKbOw3nt6PPrEk2itcd6ChywE\nQhbJzj7F53/+51h2qckpSChDxlY+wJsKEwJLTzzB7/2zt7LsBXk25Mbv+sd0oyUe+8qD5KcuYE1L\nI8AjUXbObb/wn9Crl2E2H8OZGcEJMlHg45BZ5sh8wMmARlIRUHbG2S9/FhkDayurbO/s0PQzYrug\nyuuICx1bm+fS/53n8KGDSOVY1oayt8z2+bNEb5hXXUqFGk3rnk6X7rtvXXI0e3rsmxHESAgFX3nq\nHKfOnCNoz1j1qbTn1HpgmD/Ikt1hPpS44S6rKjLIBGujktPjuxGxR8aIkRmCU8hC0+JxQrK5O2Yy\n32Y6d7Rdi5ABxSn6xrCxvEKhJb380jyTcFtsHFpFj/o0s22cgCYKxu0c00s25WLPTl0K8JEoA0FM\nkSiG2dV41/Jj/+ub+f33/i7KBJzdYnO8zdb4Aibr8atv/01OnbzAT/zYD7LrxngnWF1bfdY5+5wF\nyxMnT3P9DTdQDgZ84yu/Gx8C993/Z7S15fjlR/n99/zBQu8k6DqHx5L8Wix2XjFvG3qDPrPJDlk+\noK5bXF3DxXd8IWnajrZpiUowlJoWQZjscNx5JnsTRUQ2Vwa89hffxp2/+du0H/sYVjasCcnpQuEQ\nfOQn/zUHXvpSzu8+yGGdEYRBEAhCEAwMwxRxviIoi2l7RDLCza/k+u/5Jzz46/+e7N57mGpL2SUw\nXBMrSmVAa0oHVsNMBhAGHRVCBI4fP8oLrrsCJaGfl+zs7OCCp9qcUBZ9br75Rj720dvJ9ADiwrBV\nBmQCSxHCxccyj9JPp5VTEMFoNGJzc5MYNFVUNCIilCGrPLlSnDs959D1c6bmAmZkaGgojEa5hhAd\ng0FgOpsx3arZPPko12wc4Yl4Ad8LmLU+Ki+ZCMWjm+cZz2qMEpRZztGjB5nUu9idGQdHlwpgZbXF\n4ycfZ+OyFR6+/wy11wgrEG
QEH/Ex4G1INh5aMZu3KJ3R7weOX3Gc4BW//o4f48GnbgFjMYM+y70B\n4x2HazMefupByB/hxIkbePcffYh/8Pdfi991bJ7ffNY5+5wFy4kTV3P8yqv5xCc+yYMPPsqpk0/w\nQ//k+zl/+iQnT55mNqvo9XqIIIgxYLsksmyrmrppWN1YZ3c8oZ23ODelbR3WtnDt0z/DBehcQOU5\nMUQaEcjmgDPMTSTUHkwC1/XngnFjuOEVr+SOz3yGLgS09VTRkSFZKyThjvtYKQtM6NHqjpiqNEQv\nF/awHmU13jQQW669/uuogma36HEogBWWXjCY0FAJjY4FLgiyaUPZL9jWqZHKEdExgeG0jnTWMx7v\nsr40YquuyPOS0WAEIfLqv3sTt/3FXakPXwUKndEFj+va/R0l3Vsi/X6fpmkWPTFJ3Tse7yadlPXo\nQoMIhC7SmBYdSp58xHPN1Yq8DEi1TSn6OK/x0iN8ZBAlcxeQTc1y61nrItsjSe+qPnGtA7nLss4Z\nXV4SwoBev4fODF3rsVi6qs+ZR3YumRuxahkpSdnTvOglL+TkucewoWHWRro6QAhkLunj5rtzpMlw\n1mKdZ2NjhR/9Z2+hXD9LP78MpRRnNhv8WkQITSCgdM0H/uz/5C3f9k6EX+Edv/5u/ukPfBczf+kx\n/pnjucO3di20nuuveR6rqyvkX38TD375LmaTMdniyLW7u8tgsPQ0qjV4qqqhaTvq9gJVNUcKiVtI\n1xN47+khtSI0yeVXCYmMkkJ1XMgtq0S2csG3/+i/4P2/9HYGfsyXfuSHgQg2IrXHeIMIEi0ColNU\nWUXPGayoaYLk1T/5s7hS8ic/878ztI4sKBoVMWGIjp6HfvtdbI/ez+rWDo0M5E4Srns+R190GY++\n58tc9xs/x31vfSu9kLHU1GwWkagDhhwhHf3+kCha+sOcyc6UaZQcuOIoL3nlyxmNRhw5foQsK/iZ\nf/XvuO+eJ3A+IFwyVXXKECWp+q8kSknm8wS+Tlbckkyr/VpMlkHEYbShix150We1v8qrX3EloT5J\nVE8i85KukjRzC0IQWmiaDtloZOs4cOzr2C236B1zLI0qRoOMkJXM5g2j1bSgeNUh+gUxX8O2AR00\nV15ziFvOfmn/73bv2OGDQF+omO3s4jJPPkgNcHLeoZWkDQ7hEowxdh2dr5E+4xf/w8cpS4j+ALOm\nYXPsmcwV2+d2cF1ASLji6gGf+ez93Pqxb2Jt/TjHLn8Jx68/yBNfuTRonzmes2BZXzvKV+5+iCOX\nH+GBB+4mkxLnkv9GFC1SKrRWyZPDOZxzyShUmOTX0XX7JMPYdckp9xnB0jYdRmvqpkEpCc6xpT3Z\nZUv0H79AT+Scvupajv/Q98Cv/Dd2lMOKSHBQBMtEe3TQBCWIQmCEhugIQmDJGPcN2WCJVhVopjSi\npt8mpXcnLWXruPJJz7g3J6dgmh/kJT/844x1R/zgFxGnngAME2VoRUBXXcooLWolV1x5lNe+/pUc\nOnyU5bVVOuvIyzJ52xCYzyYUWc5/+H9+g7f9X7/AR/74w+BTy3BZZtB2+BAWXLWIEKkrcb/DfGEC\nlWpVix4YF8lkxjfd/DIOH7iCXDa0taA53xHzGd61uMonkEQbcTNLZiUezQMXHufqF2asLWsGKzn9\npZy6c6yu9rE24oUkCs1cJP5xr7dE1U4Q6hnW3ja9R0RA9ErausY3HjcOFIWgMElN0TUWqQ3zqkbo\nEm8lPRPROhBbSxfSMbmxlsmFSOwyfGzpDQPTSUXZGzF5/Axnz93B4SNrPP7AeZ5tPGfBcv/9D3LZ\nZYfY2TlHZiB0dqHTyZBCY3S2sEtzqUtuYRudXKWS7qltnwZKd12HMZeqRqVImE+tNVmeU1VT+mh2\nDg9R44b1KnDHj/8MZj1juVDQOISMOGnZLZc4ePNLefjTf0HmWkTs4ZMiD+0dB/2EL/zs/0Zj
FEfb\ndP4PWjMVJV7mjLQjdjN2+xbdWRopMT5y5/s+yPpAo+KMT/78L3BQ1vRe8EIOLK/Q3vbn7IiQahal\n4B/9wPdw4OBBhEj96LmBaVtRFCUxBnqjZaLz+NDyoz/xz3noK/fz5EOPYazCWcvQ5AQl2K6m+xd5\nIcR+4TLVXtICoxDkRYFSire85bsJXYVGElxBUWxghObs6Udpmm1KUWAbu2jtzvGdZB4DXR7IDiiW\nlmB1LUfnisIbrA1QKKyPNDbQOIt3Hc5PIDbY7tJJat2CKyA15MvkCupqewEql0Tn8Ca1mPk24kQf\nomF99TK6+nHKfkApyZLWqV1bCZb7FhkVQpUoA4hlvCvxUTFvKh55as7LXvF3uOXWu/7SOfucBcu1\n117JqVOPM1rpQ7BPZ2sWvRlt2yCESDC2vR5sKWmaNlV2MQuebiLA7/m0XDw6awmLjkLvHa23lI2i\n6RVU1x3l5N2PstSeZunxnImK6IWQzhnJFa96Heuvu4nb7riTvk1ixzIoOqDrL2EcHJhLdk2L1wYV\nFFZobvwXP4o6fphP/7uf47L5HC0iUWY0mWPoW9o7Pso5UTNmjiFjrjyxrzm/eQYZBCJTCClBBkYr\nq7RElFwgeqKgNH26xlKWJV09heCZzicsFX1+9Z2/xpte960415EDXdcSFCyVBfO6wSejToh7bl9P\n16WEELz4RS/mhS98Id6DyDNEbNBGEHyCpx840mfrwlm2zpynrUSyNfCezitarVhd36BXnKEoapRs\n0eRkRUEsA3Vn0TEVTvMQ6byjs+1iJ7xUZhIBIR0uRpxSaLXEQOd07ZzJeJuitKyMepR9jfOK6DRL\nSxssLx3lQn0KZSzGOFbLIcsqo/OS6awhErDR0FmNswYvMoQXuOBoq8Cj9WPPOmefs2DZvrAN0XP6\n1FMsDYYoo8n7vQQqiGHfsTaEBB/Y68XXRtE0DXHhhZhniasr5dcAH/hACKmBqLYdpcqookU1lt1M\nMbj6ELuPn8W1DaWUqLiYTA4ev/UzmLWcoYdoM7QCLzzLr7iZq974bdz1rvewdd/9qOARUUK0NMKw\ncug4lR0yD1DrDmKJloIyZNAGfDEh+EBfZVgRqK2hvuM2diUoWl50083c/vnbUViyIqNjRowCGUBL\nQ5QlRSZp6xYhDHmZoBOzyQ6f/Mifc+TEcR75ykPEKHHCE12kUBnlcJkL05qIx7u4uLcsgkUKvu8H\nfhApDMjUteqDR2OgnWO0xPYkKMnho1exceAY7YJBVo2n+CIn6/VQypOXY2JWY2OLCQFhI6LI0DIl\nXIRwaKlpm9TcF5zEi0tNhKIVeNkg9BATQyp4lob1VclllxUMlxuij9BJ8t6IY9e9mNNnx2yebMgL\nie8aulowljlKecp+yZoqiQKC6TGpArsXGrouMBiuceHCeRSC1j47N+w5C5YYLHmWsbJyFOE9jsju\n1vY+2WN9fR2lFEVeUFUVSqlFwETyvNx/bG9Yay+Bz0Ein0gpqKpER9dK0MsLog8opdkdZhQ3HmP2\nyGnM2QlGa1RrKbxiw+1y8t1/hHYtjdTUymFazyOf/yI3vOabyVdXsF3AFxErA7lXDKPg/T/9kwQd\n2QhzoshwImc7BEJ/wNA3iGgxQlL4SCcEXWhQUaFthzKw8+g5FAqnUi1B+wwjVSIn1g33fekTfP7z\nX2Rrc4vz5zeJPlIYQzWf0xmDjYGjV13LIw8+ntLJUmKdxYbA0qDHvJrhYwdi4bYWIwrJNSeuY3tr\nStskcHYVxujMYH0PoUD6Fk3iJ2elRGXJYGi4tAK6Rys80dfEUBK9ITiPwyOMQziJlmmqiRCQPqCC\nTBwBXyfLw4uGEB6tcnyEzDikiBw81HHoqEWomtbVmN6A2bxjtrnDePoZBksZl195mEwuc/5chWtK\nzm97DmzkCNHSy/soZZDaIEtDk4Gzkel4jJSCxkWa6d/S
YFEsiIStI5PJTHN1eSWxsELAWsuFCxeo\n5u0lwWMXGNNuccFvmgbnHHmeY+2lF0XnPM61GGOS5EMIrLVkUiXH29mcqYB8vc/w0Do74xmzR08z\nMBmnRcvBKjIrIiYKZg4OSs3VXvDFn31b0hIVHicEvaDx0qJixyE/RQVJVJ5GRZT0vOLHfw653OfO\nt/8S4swDSJeTR4gaGgQqeMBSuwx15glGhWDaCv7zv38HUXXs7uzSzwpCCGxubTGdtDSVJfrIaHWF\naeOo60C1M6ZtO86fPcWBtTVOnUkwcWNMUho0c4ZlwbCXM57VuBD3W3g//7nbeNU3vBoRJds7E+Tc\nIxAgc1ywoDQIhyQdd4UCrSRap0VMq0h0OSFs4O05vJY4HEIojNqT3yh0CGjvEDbQtTUhdOnYefGQ\nAaVTEsPQcdnhHa48IelcS+dzCGs89tgYv93RNYLIBW540RHGuyepK8+RYy/gngfuwxjNzqQhWkMc\ntvQGQKcItcdIRXQtWJA6HeeV7D3rnH3OgkXLuHC4VbQxmRdF69IvXwi0UoyWluj1UgA0dZV4YURG\noxFFkSX6PNAFi20c2lxq7e1lRGcZ3nqG5YCt8RaZNnjhCdEnYIH3iCJjF8Eju1sUaznloQ2iEJwc\nV3SbuxyYe5Ca1hh2bJXaY+PCFru3jDh0BdOtCwzmEwQBJx0yRkSQONnRbNesZIHJdJuR9wTl0T5B\n9jopcBFsNFgBuevo2x5tB6efPI8ZKoKD2XiclMhO0e+NqKotatcwPfU4vX7Bt7z21bzyVa9m+9wO\nH/nIrdx7z/3MahISl6R7MkrjO0uWGQ6sjJhVDfO6BqH5wIf/lE/9xWf4R29+M8eOHWH9wAm2N7cY\nj6dkUtH4GVEoVF7s684ApNLIOEOLgBWOyCHa7jSRc+ihJHSGIAJEhQst3kViULRNR6gjKHAxJBDh\nYkRliUi0gkEZuOYqT17WxLZJkEMEV11zjHtufYTJ2BG1YGfcMGw0KhOYYclV17+Me+/6HL6T+MbS\ndIYlB0YEbJWzu9Mwq1P7NTLSK3u88OY38NFkXgm/AAAgAElEQVQP3PKXz9n/0UHw1x1CQK9XJrfZ\nmLqfpZR450EK2iZ5kiiVLvzLKyOkTIacbduyuzshxkhZDsjLbHEMu/SC733Yl69P5/PU1+I9eZZB\nEEgkzjmqqmFaNfT7PdZXRtjoEFowOL7B+uEDzO55lG635sSRI9QnzyFcIGYa7S2tc7z0rd9PowQf\n+6n/g9VxTWciMshEyg9wz3/5NbzqWOkalM+ZqkAXPbtCclY65sHTy3KyCNFIOhFokZza3qWcWbTU\nlGXJkUOX0QXFdDJlaeQp7YjgA+Pdc/zZhz7K7//hB2nqgPOgZI7QCpMXCS3kFyA+wHYdzloG/T6Z\nyZg3FmcddV3zrv/6m4wGA77l9W/gxhfeyPpan7qqubAdaa3FRUX0DhbuXsl7MgPvkUYiYg/RHiMI\nS1tNMRo63y48RMEGg4+RLC+IUbA7nSeE0kXBopSk7PUZDg0njkV6g3NIodjor1NZgWwDs3aba1+2\nQjPXCJ1TDgST0w02dpx57CyDpcMUvWV829J0FqkkIXbkAqpZw3g3Musi6IxipFheuZFKPKPF4xnj\nOQuWLEu/nbIs96XldV3jvKexHb1ej/l8jlYLR2LnaV06kvWLkkHZo+06kNDUDW3bMpte2sudS0Ug\n0HUdUis6Z9FCMJnNEkxagJSKyawCKVjfWKVQJnmGAE0XUVlGcf1VNHc9ygNPPcUhmWGUIkSHkILM\nCj74f/8qL3rFyzBNg5UO5ZPeLEgIPjIMW9Qu0klJqyVzItu55JyMnPSWLhMY32C8IDgPSmP6hsHa\nOkc2hhQm58H77uOWez7OvNOLNLlGSJM0YFKgY6SJCiWSetr5lKvQWY7Uhmb+9CVaCAExUM+m5GXJ\naFBQ1xZJIArN7rzl
j/74fXzozz/M61/7zbz4RS/mxInjbO3usr0zI1oWZB6/aDg1KC2QMiP6Di2P\n4zpBxyZN3EQUJYgG4XO8ULTW01nFfN7hXImPlx7DlpeXkTJxzXrlDkVR08tXUSan6KAw0M8d3bCl\nnmsm0watlxgdvgzfrbJ1bpezJ8+yvHKAJx59mEGWAxrnBU1wdJUjxB7IiCwGkK+wduQmmtB/1jn7\nnAWLlDJ1z1UtKssWl3GZLJ2Voq5r8jynl5lFKrlFi5SRwSdqu4gxNVplhjIz2Gf0cteTGUWvl2os\nbUtW5MxnM7RQdHVFnhdMtndYWhqRFzrZl8aICAYpPaGxzH1kMBwQVpdotsaMW4uWGUYIHFDYyNr5\nc5z88B+zZDukl3Q9ResdndJoCy0CKyIzGZmJQJCas6LjCdcSVI72giCgUwInLFIahIU77/gStwmP\nVgrfWIIL6Mygsz5t8MgYFhBKgYsRjSSGZPgToqcNC56yUCyNVuiqOd3+vS7B5WzTIrPIoFBYF6lt\noHaOUpS0uxXv/+AtvP/9H+IN3/Z6XvKyl3P5sRVmdc3Wzg51XSfWmlCI4EBZBAahBDpcT9OsMJ8V\nZK5GZpsImYqEXRCpLTpqPNDM1+EiDWOv7CNVMl3q9Qy9XkahwRSefplT1hnLITJ3JTbXFAjOb3bY\n3ozaO/zQQOYIUYEssF5Bk4rZifxZYEyPPPN0eo0TV72Wshjgnnl3esZ4zoKlahsEirwcYNsaLZNB\nZowRLWHYL7Fdh/eWEAJFkVHXNVImV13nHc62GJN2qRgFUlxalCyHJfNqRt3M6WzH0miIWUhth0tL\nzHbHaCE5uL7BdLqNUoK6nmN0QRcizgdsW1EFWLpyg9C11GHO+dDRGy5x9Mgxzt79FVZiRFiQIuJ0\nYBYzXv+2X2bWOf7kp38CbRusi1QiMtWaR9qGcR6xWiKix8k94ERyFQ7B4YHdabKv1CqQK50kKSLt\nPkomp2cUONcldbVMzsJhgTXVimT4JCVKCJbzFaaTKU3XJPffVJ/Edx2dSiwtrSKFU9RdQxCS2XyO\nlIr3ffBDvOe97+M1r3kN3/ya13D15Yep5nO2treZtQ6vFUIaoooEmSDho/6AQJ/J9FG0j2SmpovJ\nhrtuBVXXZzobUeTLwNPo3f5ggJSetnIYE8j1FrmUaCJC14h+sgnMrWReB2ZlZLAC484jyZHaE6qa\n3PToFddQzWYEBXlskjVeKGkYMhiscuTQlehsQAjpPvZs4zkLFgiphuI8hIgqDDKCW7S/xhiJStM0\n1X5K2FpLjG6fWqmVRghP1yXruGcGS9e0aK1YXVlNF++QvrauG7YvbDEoSg4eOMDuzg5CekxeYLLU\nLOVCxMeIyTKEEGxOJ/SvPsCgDUwfPcXubJeTXxmzKgIiM6goUU7QikhnCp588DHuefhBmtZhVGBW\n5jwVLGeqmqY0NARilAmWt6ishwUEHJLL12Q2ZXV5gJYKSSqsLjKwRB8RCxewxP5Nv0+lEis4iSXF\nvoe8dYG8ZyiGPUIFXd0gL8Ioee+p65SON8ZgtGE8b2jqBpUZvJdkWcYnPvlxbv3oR3jxjS/ida97\nHZcfOYIXipNnTlO1LRIJIkOqiA8tg9FljEartN0O4/FpNs+fZnuyS5AFK2tX0R/2ybL8EkrDyspy\nUpfnmugmyb1aO4QIBKuJ0SJNhs4VzkvKAqrWYoKnCjVYRcDSeE82uIJ8eYAIks53hBAxIadUGxg3\np4w53YUZxeUbxPi3tJ9F6wxnU5pYGU1b1QsZC7iFuG+PtBhCoG3b/Sp/23YURSJxRGA+q8iykqa5\ntBLsncPajmI0RCi58GVMK/fq2iqDvMRaS/CBECPNvEJGjQsRgqdtW8oyfU2Wl8yEIxaCK296MQ9+\n6gvkQlNFjw2O3EpykppXTy/wuf/8NjolGCjHAz5y3s3ZUpGqr5P7VEivPoiYGAIX0e
HhaVp807QE\npwhSorQgOIfRGinVQkOWjq8pQNIiopRavPZiXxrkpV9o5BR5v0cvL9jdTsJBIQRGiKSfI4K1BB1Z\nGpRopZk2FY1vSQEZKDLDvffdzZfv/gIHDx7in/7wW7lsfY2sKDi/dYGd2RgpM7oYkUoSXU5e9FnL\nDzFceX5atJzDxgEej/cWLrpu9ns9DJFGS+paIiiI0RPDAvvrOoJzxKgJUaJVhpIeGR0qKqyVaCSt\n93Q2A1YwWYbMMvLcsGZyPvPxD7C7+TC5chw4fDmvPvqjqL+tnpIhJkFf8CGBzxZSchcdxHSnaZrm\not6LNJGEUCDjIvsF1nb0ByOMLvatKvbGoNen6xRlUWB9h5KaST1DKU1mMpquRS2OdN4F9u6ZYqFk\n1lrvF0l7ZomVkFOHji8129xbKHptQHUtAxlYNZ6+NAgEIY7wtJxp5mxnEGKBNRKPRcWACBLlwCuw\nkr+Ug5jI+REJ5IXEWY/JdYKvioX2bdHkJaUiXiSMfLqIK7HWpqAxGh9ShlApycaBA4wnY5qmIVdp\ndyImywpsJOpIr8wwxZDZfEbbNYsFTCGEIssMF7Yu8Iv/6T+yun6AN77pjRw8dIi1tTVmdcP2dErd\ndkjZ4oVE5QP6eera7IIn6B7eB7q2uSRYyjyjq6bkqoedjwh+B8ec6CJSdgThkVoQfMA7BU6Ab1Nb\nuMpQQiGCgBqEjXg6ApaslzMq+tx/+ydx2ycZqhal+6wfeD4ipKP8s43nLFhsSKu50ILOdjgncEIh\nUMRosU2L1hql9kDhyVDUuUgUHutbQCFNn/FsxnAJdHYpBV1KjTKauW2JOGYLJtnRw0cgpABtvUvC\nQrFnfWARSuGcT1k40ufaWQVlQetDEh02LvWNiEiDYleJZKYjwNKg0MiiSAtBDIgAIhqChCg8waRd\nUcZkifdVmIIQFxbXEaUDNnqyLKIWNgxx0RiXK4nJc7y1dNEioifLsiT/iQLrfLpLiNRvExEgFFYI\nbIxkSyNillPPZmghKKRCBU+Iks45gnPozLDW71NpRd11SCVweGIQyBiZ+Yrq/BP82jt/haXhEt/1\nprdwxRXHufLgIZq6ZavapfMSJXOq2BAUibbiBRIw6tL6WK41WV/hnSPqQ2xtbTMatBjjUTokTZoL\nEAzBBoJr0Ti0zBFiTiZLnMsJdgmvesmWQisyDI/f/SXOnrsTKSNKpNT/9S/9eoIu0PLZuWHPWbA0\nzaLRJnqMyTG5QUhF09SEBa+3bi29PEeJlC6VRGxIBUyhzKIprCPLMpq6/SoI+nQ+BiWIDqq2Yjqe\nc/z4cbqmJdOablH9B9DG7B/99lpy94a1liwrsN5RFAW7kynBSIILxChoY8BKCcqkFT8mly4f9gSQ\ni6yUTDtCCAvPSPiq3RAWiuDF58MCWC2lQQiPi4lYIoVM3vTe4ZoGqZKFXCQVWiHZSuxJgkJI3xvj\ngrYfIgsBMmiNV4kr4BPNAReSE1vwLlFxIhRZRlZkhBCTO/SiMBlaT1SpwDcbT3jnO3+DwWCJ73zj\nm7nmmms5vHGA1gUu7IxZLpaYNg1d5/Ehoo3+KgNWIQ3eFZjcIQuB8M9jXt1Llp3CZGoBGpS0tqVx\ngtYrgkwsNYkmBkHVgotJilPokv5wnbNP3MtD99/CilDYYLE6Z/3ICWQ+IAaF+CvI4M9hUVKRZZKv\nf+U3cOOLX0ie9fj1X3sHyIxZ1aZJvABjC5Hk2J11SJURggMlyPMCt1DgNk2DdZcaeXo8Uihmdc1s\n3rC+sUFd13RNi9c6+TAujHFiCBRFkVQCMWKMTpPCpRaBrusgKuZ1xdbuGLdQMzvCopk9ketjTIjR\n/Z0iLly4QyTKNPn1RUck4GveVfY+ThKd1MvTKzKazqNJzl6CiBICHxMD2QePXiRHQggonT3dhw+4\nxeuMLmIXRUi78KE3RpFJiZES7xKLLQaPiiAjBO
HT40phpMLkKd1vbYuUOhWA8YQIAc/u7ha/9V/f\nhVKa73nTm7jhxhdxaHUFH9OkO1dtI4XGB5lUzReNgEHnQ4R0KO0oGFFPK6oGyr4nLzxdmOMD2A68\nVXTW45yicSWTbo2g19D6ICbrU+YDdk89xEN33cpSptAuEkWkMz1e9A2vofYBJec8cveDzzpnnzMz\nIyGgszU33fRinv+Cq7n+hhN855u/De8agguJuOJJK3frUh++D/jo8USs9UwmU2xrEULSWXtRDSEN\nk+fUbcdkOmNtdX1h5BpRWqcs157l3MKn0VpL27b71nOQJqzzDuc9znlAUM2r/aIcJC8UAgTnFyA3\nkR5drPSI9MfZS/nsBcTXMr+9OFD2jmfOp+a3uukYLg2wziKUJAiBExGZ6X2H47TIpCMrJM1TCDHZ\nxvlA21jmVQ0iMhj06Pf7FEVJMRggM4MTApkbvILA4mIUA0qKxYce27V41yFEJM9NcuNSCte2eGuJ\n1hGjp7MNTVvxvvf/IT/1k/+WT3/y4/imYnVYcuWxKxgNc5RIBhgXD60zpDBIUYIoiTqjHF3F2sbX\nM5tucOpsoGoz6lbROkPjMjpfUMclxtUSjT+CNCcYDPuMRn06u8M9n/8QQ9kho8aKiFeS4eoRVg+c\nQKoeF86e5CtfvPVZ5+xzpzqWnkwbbvnwh7n2mrfig+fmm2/i/i/ezV13P0bdRpA+EfSFoG1bsizD\ntvP9k0uWZRiZs7m9g19AUS4eFzbnVLZjZbhGoQ1N1eC9J89zqrZD63SRV0qhFl2Zo9EIay1uL+Xq\nXarj+EDXWVAKFy5tBwjRJ4RqjCipiC4F4V7YOJGcrKSQ+8ewvUzfVwWMfNpgVQAhWhCgTYZ1FrzH\nSIHwDqkURpsE5Igx9fXkhizP8d7jnaRuWqbTKcPhgJXVAXKYUdcdzjfEaFNdxAX0bIosc7JMo4Qk\nK3OE97TzBhc9uQct91gDQIx47wgiCSqlzlB7wRoCkZQoiBEmVSpW3nLLn/He976PV33jq3jdt3w7\n63mflb5ia/vSdl6pCjwtUigEji645H6cjVg9OMSGHebVFuPxU+zOO3YbyW7V4eURynyDPFtCqREu\n5jRVwxc+9iesaIm1NVZ1KLXEN77+O+kfOk7tMqrZDnd++qP06r+lnZJ7KM3HHz3JbHfO8mUbZEpz\n000v54tffhjvPUoJlpaWkuxF61TlN4v0JlBXNZ1w6UhgO/JnXPBF5lnp9ymy7CJQg1w8twIWrbTB\nAwpjSmazOUVe4n26z+yZwOYmS5N7kZ0LMX6VISiL/++bC3Hp3edrIYn2+QIXP3bxexAJ2NE0DRvr\nq/vHQmChnVP7X5dlGc45Tp8+TZ7nFEWP4VKfojSXtGbHKNHa4D30e31ErFknUnnPZFaTFTn9rESb\nHCEMbTXHhZRRw3kMe5edSBAQnE3gOxUx2oAoabuW2nZEUherlJHpdEKeF9zx2U9x+2dv47prvo43\nfscbOLS0cunkCBqpDHVXkwuN1hkhBhoX6VyPsr/KwFyJC5czWCtZ7gTTeUcXM7wXBHIQOacvTPjS\nHR+mjDXW7SLzHrF3hH/4/f8TtetRNYHoGz73qfezmltOnd7m2cZzd2eRyaNkMm753f/2e/zjt/4Q\nwo5pmym9Xo71DbNqQtuk7E6ep0xF6+r9S7lRJvkFNi1Zr8DZS+kc83bKaHlAPZ6jVE5mDCH6xR9P\n4v1e/4Kg6yxSRLQ0yQFqce4PIewTUrSUtCGQfFBUEidyabCEkM7ue81oUkoWlUZgEVyLaBCRr8En\nvnjEfbZXU1dIuc54Z8zy8vKCk5bS63s1KCEEKMloZSV9HAJtW+3vguk96pQiDQKBoq5btMqY1RMO\nX3uCA4Xh9JlTeB8xUlKUPfr9Hp2bMz63RY5CKIGQkUxKfFwkIboWqdKRKkpJkacaT9W1ZCojehBK\n4l1NppO/5w
P3f5m3PfQAhw9vwM1Pv+vDGyNOX9ii1yvTfUxlxODpgiPmksqnv4daPgLSUFoohoKq\nbvFREaLGBs3tf/QHiOokZZ5hsw1e/sq/x4nnvQoveqA68szy6T/7U9z5x9iptijM/4DUsRBiGXgn\ncANpWf9+4CH+Bs5fIsYkljOCk0+d5C9u/TiP3H8/99z/MBd2qnS3EJIYHLPpjCzL9zliZV6AFNTz\niuCTh2FVT76KKHhg+QCz8Zxe2SOGiPPtYmVd9J0LuaijpEmJFMQI1js66xGo/dXb+6Rrss7jg9x/\njq+5swhHJBVYESFlv/jqXeSi280lr3tvdwJBXPysZFuhKLMBs8mMsixxIZDnOWXZS8EpFlm9xc6n\npdrnFBiTpcBXLBCr6fglZFIOyINLPLVznkika1uMMEijCAGU0HiRM1hdY7q5RV+bVJKBRe+JxHiJ\nVEmsSghJ/SAkpTJ4EWl92tWyIicEl2iRSuKc5ezpS48/pfI87/jlzOuO87tTHFAFgQyKTGUEWiyC\n4DOE1EgkymQoafEIfND4AP/zW/85d972Sba2xtz0mm9BiBIR+8ggkbrki3d+gp0z9zIMMFeJePNs\n46+7s/wS8KEY45uFEBroAz/J38D5q2tbnE3Qg82tyG/95rsJIRIQSKMRIq3s/V4Pa+2isUsiF/cX\nKSVlWVLXNU3bIoyhe0ZbaPARowzW2sUxLuxXtJMMJE0W7zxC6cVOsrjDyFSQzLL0b8qOpZS2DyHV\nX77GUQsWgOnAvl/9xfzhi0e6/LP/NelSzz4IL316wSYTge2dMcZZDhxYB5Jjsc7SjpsKuiFB6Lzf\nf23GZPtylj3py17DV5HlKCJRJlSUkTlnz5whzwus90hrUSZNaES6eHspOD+bMRwOKQRIt1g0RGrg\nkyoVRwWR6BeejdKQ9TIa6+iaOv3+CQSfkiu2u1QA+zM/81N8z1vewvOf90KuPLjBrGmYVA3TpiOi\n8CER8BUiscAW71XJPiDJMkPjIsEG/s6rvh0fWhw51idoho2Bh+66g7u/9EkGqmXeOJTJvqrT9pnj\nr0PRHwGvijF+H2mCOGAshPgO4NWLL/st4OOkgNl3/gIeF0LsOX/ddvHz7h1RrHU43aO1C5aCDEhn\niSEFi9YCqQTFwmimtc3+ZN/rlqTuqDvPqacu9lCCqq6wXbfgYmVkZUGms5TmFBEl09FFqEiI6cLc\ntHUCbovUi9513X56WYinJ6LYP8aJ/SPQ3gRPqVizHwD7E/8ZQbV35wkXBd0+bjU9kJqTSDtNUWSs\n9Vbo9UpcqBGiwPmwf7n31qb3tHh+v8gO7mf8YlI+GGPQcrFjLrpO9WLnXt/YoK5qWt/QNhEtJJnW\n6BhTrSmAk5rNyZw8BtZ7A2Qh9lPhMQQMi1qOXOzcdEiVUxhNYRRNZyFEJm0FQaLVpRoG6zp+53d+\nG4HgO7/jTbz0ZS/nivUlmqg4P50xmXZkOgVJqtorpNAEUrofJ1DSIHolzlpQObGxZFmPzrY89dBd\n3PaRd1PoDh+7ROmn+JsHC3AC2BRCvAu4EbgT+Jc8u/PXxYHxNZ2/nHP7cpbOukTBlwKtDFqnImTS\ngSUSpbUOSEAJISVGKYSISSncNJw8s4N5hnNTv1fgFtXsuq6Zzecwn6eMWJbRywusdQvwRUbbthiz\nEHHGgBBP45asDbAfNIve9cVu9Myj2J5t+f7Os9g+9o5deyapf9nYP64B/2975x5r+VXV8c/ae/9+\nv/O4M/fOvTNTSh9UqBVINBQCMTwsihE0hliNRIkGCWriH2A0QSyGSOIfNSAhRmJI8BFoCMYgiTQq\nARKQWqQ+6NhCX7ahtNPOszP3eR6/3957+cfev3POvTOdXsvMvZCclZzc87rnt8757fXba33Xd60l\n5C4sJi3AGAOj0QBrlYBiXIHAZLcNISAkMqWHScyV9LUTow4ihDqAQJmJqa37
1+mWDDfGVJ00DTn6\nhqJjWV5e4vHvPoFoiVVPQ+DJrQ36I1joVQTfJN6aKjaXBxQuDRAa1VuocWn0n3UEa+j1+4wG9QWl\nFVtbG3S6Jc4Z/vnLd/K5z3+ON77hFt74pp/mcG+Bq5eOcvqZNdY8xCZ5B+Bw0kGc4CMYKRDfEGMq\n5yhLZRA2WT11ggf/9R855EaEYIjiMCkRxrMTj5LsJs/igFcCf6mqrwS2SDvIRJ7P5C/n3CSoTVdr\nSbyvqMQ6YfWlsWm0QPQ4ZwCPCxHjPbEeoqo89fRpTpx4ml6lXH/N9mnFxjgMidJeWEe/6lGagsqW\nEODp06dY3VhnazRkMBpleo3Ni73JsG0ACSgBHxrasQqzxqAwiROipt0nhAQkJEk0HoMFTXHANhfO\npJvK9h9K8k0lEuoxPQTrkgn5oDnBGFJziDo17Jg1iJZU2RItJSmKIbtdeTS4NoEYhNGwYbA1Bi0o\nKBmub+X5izVvu/UW3vPudyXOWRjjNeCjoKFg6IUz5wcM6kgjltpCkHQRanzAiFAVJSZKdtc8EpVC\nlcNLfZYObL/IFZVjY3MTP27wW1tUhXDXXV/lAx94P3d88hOsnn2KpR7cdM1Rrj96hH51gKLsg0ng\nhTMOQsSopXKKs6DxIPX5Ne6/+/P4zWdS8tZ4KmfplhXGCE095lKym53lOHBcVf8zP/4scBtw8nuZ\n/HXswYexxhJiZGVxkaMrywmlUib8p6ZpMGXqmj8YDBARbOGwRtgaRs6ffIazZ7c4fHSFQ8v9C0xy\nNB5hNBmjcw7NAT0kY10+dGiyqHzwrK0Nc6lyl6rqADJZ+M45fNMmIdmWJ7kQxZrKztdlkqncARe3\n78nZdiEH7ChIxFlHv98Dibm82k8Yx60blup9zMRtbGMT1cxEyIdrDT0CGhJ5snUl26YetizY2go8\nc+4cP/7qH+OpU+e447Y/QkPi6SUIPRk46hAnjBqP39qiQ0FvwWBdDzUVsUkdVLpFSd00DEcjjEvF\ne8NBPamabaVXFljtYDQyHg4RH7CuoigKvvWt+7j/vge4+uqr+Z3fejfdgwe5/sgB1gee85uWorAM\n64gaAxjEdqmDZ7B+km/c9UWOP/YgfZvYDpV1bA03Wd9cb7NHl5TdzGc5KSJP5iD9EdJMlm/n2zt4\nnpO/XnzdC1FJiFRVdCZw7jgzgXu93uTESU521XXDgX4HXxvWN4acX93iyNEVFpd6qI7RsH2jdKaT\najY0EENEXFrkbT6iwOBVUx2MEei6CbVlMDiPSHpvQsMC/X6fenVtEofMZvl3/Gg7HqaGEe0GbOw0\nD7PT4NJfzW6BmeZt8m7UNPVk9wAmAXtRFEiYGnAbN7WPnXNp126TrTmemcRUMnXXgkaCgf7SIlur\na/zb14/xzWMFMfdvgzYOSzw6LyBqE93Fp0rNl15/hGuOLPPV/3qA6KpEDA4N1kT6nZKxT+fCiaSp\n0TPSNZZut4ePAR8idYgMBwNM4bDWIKKcOXOK22//IAcXF3nHu36bQytHWFo+woZveCaM2ayFoB1Q\nQzMc8PB9d3H84XvoGI+LgooiqhzsVCx0jqCZonTy9CmeTXaLhr0b+LSIlMBjJOjY8j1M/qJIvmzU\ngA1hsoC7nVQhORhs5YYWNXUO0quqoomejc2aM2fXOHz0KpaXK8YjjzWdCziJdd2w0OtnNKwg0OSq\nSs1BruBCchOiKj6GSYZ/YeEAIcQU6+Sa/fXNAZ1uLy/06eLe+fW2P5dau7bPtUnGdpFua3ZHuxO1\nn99uQUKv14MIVVVty/+04r3fNuatfa11xZqmwTDVoa5r2ra4MUaitrU+Cfb1ViEopujgfc36sMHE\ngNEA2gIa099BFZSEcplOwdFF4U03H+Edb/89Pvkv9/O1f78HHw3jwQABOpXDB9Lgpbg9Zgl1ir8K\n57BFSaFgisDGxjpiIoXrAwVR4PQz5/jI
n32ExUMH+I23/QrXvOQmDh5d5PTqiPVBzermkO8+/AD/\nffeXWTBDjAQK6WKco65HgNCMhwRN8dalZLcDWP8HePVFXnrek7/Wz53HGEtvoU+QgK9rep0uEj1l\nPy2I0WhAx1pGwwFVp0dRdTn+9AnOr57nqquO0F8oiN5jUARNfvyMCJaNzQGqSt0MwdRAIk+KCCpu\ngg6lmo8CnGXc1DR1SnAGiRw4tIQfNmBr6nqMUKKkdj0T5m2uIVFVUIvJC7FlHM9qpTGhU0Z0YniT\nHSTHLZp6rBILxapnZWmBsa8pKBAxGJ9OHc0AAA0jSURBVGmPpcSQjAKjGDs1AN8kXZo60jQ+I3Tt\n7mMwGd2THEfMQs7aSNZVqX1kHBv6zuYxgpbWTjRTnVUDzgBqcDbSLftoEPxojZfddA2/9psf5t6H\nTvJXH/s49fqQuvFoCHQtDHfylKwjqBJDRH0ynIOFo7d0kFGIaLTUdUMTElLZ6Qjrq+f4i7/+ON1O\nn1tv/WVuuOHFFIOac6dW+cYX/p5+2MQIWGPxcQR12weioSo7NMFsa9p4MdnHLvqHqeua4XBIKAp8\n3SAK/W6XKIJYQ9HppuE3BxY5t7rB+TPPsLa+wcrKClWnmwmUHmMso7Gf+OqttAswQc0Gn3MoIpm7\npUJVdVP1X9WjHo/TfEHnGNdp+GfZ6dL4mqIoKQtH3UA8u7ntGNMZKOkKafNC3kl72bmDzD4/gZ7N\n9LEqaPAUZUFpHczkSmahapMh2qIsaGqPl5inoIVtQMQUcEiI3tg3E5QsRvA+Tt4vGRkKITXMGHlh\nMGroli6hT5oMRaOCpstVwGAJVCZiCbhOn+Nn1hjUnjBc59WvejlvvPOzPHL/ffzJbR9k68wqSEG1\no6F7+5u236/tNtrUNWVVgRoK24HcLNFKmhs6HNd43/CpT/0tqkpZdjk3bKhMQGkSFSZKzp+V2/Je\nzwUbwz6yjkPj6VYdjhw+Qq/Xm/jWw/GYwbjm3PoGw7qh8ZHV9S0GozGnTp/m2muuZ/nQVQgFwUOt\nkSgObMGOjWVbMlDawrKQBqwGrzR1YDSsU+O3xmMVQu1RH6hsQceVGE1dU9qTl3YMsy3/AdsNs12A\ns7HIbIyyDUmbeWxMgsbb/xNJlJilhQMYZFIz3+ZNWqQLMlTsQ0baUjfO1tVqx3a0XLjgA3XwNDES\nFOqQmMmqZIOXiavXvr+0FcZYxiEwjoFowKsSDIkh3iJwUYhiKRaOsMYCT24IVdkjjEC9J6rwohte\nzB9/6EO89FWvZFOheRb3x1mHEUOn05nu3N5jiVQumXOnLHBi6FYlC/0uYmBcDwnB04xHHHTKgjMT\nuL/Nzc26yu1F7lJADexrw4p0xdgabGELx+JimmSlKjx1/GkOHFhMvYQr4dz5VZrGc+ONL6VpasZN\nqtcfjkZgzQSy1bCdoj8xFkmN5WxRYjMkJKoUVUWMgaJMV20Rhw9KURaTIHo4HNLvlwwGIzAOLwYr\nubyZNu5oyZ0zmfodO4aSi9aQyQJvk6vbELMZ42qHDlVVhWqkU5YMmzoHuSbFDEZoxskojLNE72nn\nsLTwdbtjNcES1KMhtyyNea6NKsELgmYwZfo7Ji6bEmolkGK7TlUyGAzpdDo0scE5M0HXBMWr5dHV\nmkNjxwtf9BIeefgxPvznn+CR754glAuM17eIGeSoVTFhO/MCYi6TTjVNMWYELqTGJGnhB0RDLrGW\nxOzwqSHh4kIPn11T9Q2bjUJmFhRFRQgpId32PUh95S4f3eWyS1VViQCIMBwM0wyVsqQoCq67/lrq\nsWc0ajh5ZpUQPFcdOcpwXCMmZbcH4xGuaEmBiuT6yllpYl6EChgLJs2M75V2EmuE6PGjcSJZhtQY\nwwXHgf4CvvGJP1Y39KuSaAxNo/Qqx8gr07EN0yAX0sQxl5ObPoQp2XIm+daiXhFNbWBz32FhaizW\nWlxR
YsqCqDGzeEMa9KOgakAsMQc6qYlDpK7T4htlt6qp06SuoFPGgYlKyAwJABslcd8yZcjm4Uam\nLQZzBquptr1pIsaVjGpPEwKOBGcbIt4HOt0l7nngcb709f/AWkNRHsIVJrW+sgaKTmINqiaGxo4L\nuheTjSNmur+B3M7J5AtUwuJykjqfg661qXuLNTQaoVCkKLCW1NrKx0lcYvOguACTXthcemPZ/7Li\nEFKte6fTIYTAuPG4okOQyPrWEFcWLB9cScVOIblBIppo2zFizJT3tFNaWNWY6TAk55KhpLkuqdFD\n6loCMRrKopez5KNJLXsIAc30XxFhcXmZ0xsnErAgszGGbrvNQssiktBgSVC1sp3RnP5fJrmT9vO6\nRYVEZeRrCrHE/P6oIZXBZhRPQyTkC0bLQGh8NlJjMyKXdkBVUozSTHM+knfDJmR3UhWRIjXty8m+\nKcEzGfvNN9/M3XffTae7gMaQaPqqPHF2lbJ0uPIQxloiSpMX6mhcp9imSAlo33hM3L5Ky/5BjEaG\n4w26IdUIRY15F0kwv4bUFWc2sW1camYSfIPBUlhHbBLR0+KQskJRvG8yH9Dg6zA5B3rJvPq+lhWn\nL1iWJWKnJ8BgaULk3Oo6BxYPgYAtXHJyRAhNxBoLGnNgmq6Es8F0K1NOV/r8cV3jjKCkfEGqrXco\naT68swXt6rHWMBgMpt1SRIjGMK4DZVFMGl4AF7hS7YKa9YUT+hZSXKBTt2M2sJTsqrX+eQyRxYWD\naIzpN2pPrqYeYr72tIGaqlLHVF/S7krNLBKnIBnyRRPjwLkqw7/K0sISq+trWFvhNWAiWGGSeGy/\nw2xN0LFjxzITA4Imyr8rpnX/RIPgqHWAKwrqpsYEBSxKjTUGY9wFFa7DJmBiJNa5L3PdUJRlUggS\npD3zu00vRnVqoiEWTDpKu1tYUpVtIq2GSbReOEfjfc6DfZ/W4LeJssToLTGSBuhEhadPnWI0qjm0\nuEKjIeVAWu4S+eTENOs9asx5iQg7fM6vveXrz63Id0jst/+vvPl5/M9zyUV0OcmTV+BAF8qju9Bl\nr6QuOzRxTG9oEOtY2xrxAlegsUFLzeBGQerPGClcCRppmmwbJjElLAIGmpAK0yQGQvAUYmlSFxBE\nEzMaK/hwaT9s39CwWYkhuUEiwokTp1k+tMKP3HQToBSmIITkdllbIMbkOIDUt9GWGFsixmKeI6l0\nUXn8Mn+Z70Ue328FZuTx/Tv0W9/5dm7/2EexZYGK4fxgiBUwCGWALoLzDQ5Nc0hD6vustgJb4Sgp\ncZRS4iQ12NCM9rXupBiIoaFAU8w1HlE+h177N9o7J8CUhJGjgTNnzlJV3VQjPxgwocL7SOowJZOm\ndxO/PkIkJdkunVKayw+K/PDrXkHn4CJNU2NCmKBWpbGYCCYH8FYUsUJVlQxHQ4JRTDBYk4wkmBqN\nEGJys0OIU3RSI+ARk1sgaXL9LiX7Ch2HELDOAoaTJ89w4403srq2NUmexRhxUqGaWMmQdp8WvXDO\n0TQhZWVDIITIa7/wuonPHIJOWMKgJETUbJsEdvypJ3hR+KFJ18bxeJxbIg1QnaInpXOTgD09Z7n/\ngQeRmaxvGzOVRedCSJgZHpi5cPipiLC1uU7/sRWCbzjY77F4cIHQpN+irRfxwWSXNGb0bZrXCZrQ\nqTauKGQKFEzZ3Rl8yNTBNmi3YRakiKyun2XliZXJZxeFncRgKVabAhujYZMheDOZpyOSmBLGGIpI\natfadfziO36dhx55gFt/9Ze47tqr2RoOOby4TKfb5czWGuveE2zg7JlT9IoSIVLk5ubRpIIFDIh1\nFBryGnEs9TusOGGkghfHwEdCtHiJeKOJMhOnuqmGCedPNaJWqcP2svSdIhejbV1pEZG9P+hc5rJL\n0Wfp47ovxjKXufwgyvdFgD+XufwgyNxY5jKXXcqeG4uIvEVEHhKR/5
XUFeZKH+9vROSUiNw/89yy\niHxJRB4RkS9KavXUvnZb1u0hEfmZy6jHdSLyFRH5toh8S0Tes4+6dETkHhE5JiIPiMjt+6XLzOdb\nEblXRO7cb12eVXZSNK7kjVQw9ihwA1AAx4CXXeFjvgG4Gbh/5rkPAX+Q778P+NN8/+VZpyLr+Chg\nLpMeLwBeke8vAA8DL9sPXfLn9/JfR2ow8vr90iUf4/eBTwOf369z9Fy3vd5ZXgM8qqqPa2qV9Hek\n1klXTFT1LuD8jqffSmrfRP77C/n+pI2Tqj5OOhGvuUx6nFTVY/n+JvAgqex6z3XJOgzy3ZJ0ETu/\nX7qIyLXAz5EaObZI1L7ocinZa2O5BrbxNy7aJmkP5FJtnGabj10R/UTkBtJud89+6SIiRkSO5WN+\nRVW/vV+6AB8F3gvbiOP7eo4uJnttLN93OLWmvf1Sel1WnUVkAfgH4HdVdWP2tb3URVWjqr6C1H3n\nJ0TkJ/dDFxH5eeC0qt7Ls5Dk9/ocPZvstbHsbJN0HduvEnslp0TkBQDyPNo4PV8RkYJkKHeoatsN\nZ190aUVV14B/Al61T7q8FniriHwH+AzwUyJyxz7pcmnZi8BoJohzpO4wN5B85Sse4Ofj3sCFAf77\n8v0/5MLgsSRxbh8jJ24vgw4CfAr46I7n90OXw8BSvt8Fvga8aT902aHXLcCd+/W7PKd+e3GQHT/I\nz5KQoEeB2/bgeJ8BngZqUrz0TmAZ+DLwCPDFduHk978/6/YQ8ObLqMfrST75MeDefHvLPunyo8A3\nsy73Ae/Nz++5Ljv0uoUpGravulzsNqe7zGUuu5R5Bn8uc9mlzI1lLnPZpcyNZS5z2aXMjWUuc9ml\nzI1lLnPZpcyNZS5z2aXMjWUuc9mlzI1lLnPZpfwfVKttii8aM/YAAAAASUVORK5CYII=\n", + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "# or show the bounding box of the referred object\n", + "refer.showRef(ref, seg_box='box')\n", + "plt.show()" + ] + }, + { + "cell_type": "code", + "execution_count": 26, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "sent_id[64727]: woman in front\n", + "sent_id[64728]: lady smiling\n", + "sent_id[64729]: woman\n" + ] + } + ], + "source": [ + "# let's look at the details of each ref\n", + "for sent in ref['sentences']:\n", + " print 'sent_id[%s]: %s' % (sent['sent_id'], sent['sent'])" + ] + } + ], + "metadata": { + "kernelspec": { + "display_name": "Python 2", + "language": "python", + "name": "python2" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 2 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython2", + "version": 
"2.7.6" + } + }, + "nbformat": 4, + "nbformat_minor": 0 +} diff --git a/tools/refer/refer.py b/tools/refer/refer.py new file mode 100644 index 0000000..75cb316 --- /dev/null +++ b/tools/refer/refer.py @@ -0,0 +1,359 @@ +__author__ = 'licheng' + +""" +This interface provides access to four datasets: +1) refclef +2) refcoco +3) refcoco+ +4) refcocog +split by unc and google + +The following API functions are defined: +REFER - REFER API class +getRefIds - get ref ids that satisfy given filter conditions. +getAnnIds - get ann ids that satisfy given filter conditions. +getImgIds - get image ids that satisfy given filter conditions. +getCatIds - get category ids that satisfy given filter conditions. +loadRefs - load refs with the specified ref ids. +loadAnns - load anns with the specified ann ids. +loadImgs - load images with the specified image ids. +loadCats - load category names with the specified category ids. +getRefBox - get a ref's bounding box [x, y, w, h] given the ref_id. +showRef - show image, segmentation or box of the referred object with the ref. +getMask - get mask and area of the referred object given the ref. +showMask - show mask of the referred object given the ref. +""" + +import sys +import os.path as osp +import json +# import cPickle as pickle +import pickle + +import time +import itertools +import skimage.io as io # needed by showRef to read images +import matplotlib.pyplot as plt +from matplotlib.collections import PatchCollection +from matplotlib.patches import Polygon, Rectangle +from pprint import pprint +import numpy as np +from .external import mask +# import cv2 +# from skimage.measure import label, regionprops + +class REFER: + + def __init__(self, data_root, dataset='refcoco', splitBy='unc'): + # provide the data_root folder which contains refclef, refcoco, refcoco+ and refcocog + # also provide the dataset name and splitBy information + # e.g., dataset = 'refcoco', splitBy = 'unc' + print('loading dataset %s into memory...' 
% dataset) + self.ROOT_DIR = osp.abspath(osp.dirname(__file__)) + self.DATA_DIR = osp.join(data_root, dataset) + if dataset in ['refcoco', 'refcoco+', 'refcocog']: + self.IMAGE_DIR = osp.join(data_root, 'images/mscoco/images/train2014') + elif dataset == 'refclef': + self.IMAGE_DIR = osp.join(data_root, 'images/saiapr_tc-12') + else: + print('No refer dataset is called [%s]' % dataset) + sys.exit() + + # load refs from data/dataset/refs(dataset).json + tic = time.time() + ref_file = osp.join(self.DATA_DIR, 'refs('+splitBy+').p') + self.data = {} + self.data['dataset'] = dataset + self.data['refs'] = pickle.load(open(ref_file, 'rb')) + + # load annotations from data/dataset/instances.json + instances_file = osp.join(self.DATA_DIR, 'instances.json') + instances = json.load(open(instances_file, 'r')) + self.data['images'] = instances['images'] + self.data['annotations'] = instances['annotations'] + self.data['categories'] = instances['categories'] + + # create index + self.createIndex() + print('DONE (t=%.2fs)' % (time.time()-tic)) + + def createIndex(self): + # create sets of mapping + # 1) Refs: {ref_id: ref} + # 2) Anns: {ann_id: ann} + # 3) Imgs: {image_id: image} + # 4) Cats: {category_id: category_name} + # 5) Sents: {sent_id: sent} + # 6) imgToRefs: {image_id: refs} + # 7) imgToAnns: {image_id: anns} + # 8) refToAnn: {ref_id: ann} + # 9) annToRef: {ann_id: ref} + # 10) catToRefs: {category_id: refs} + # 11) sentToRef: {sent_id: ref} + # 12) sentToTokens: {sent_id: tokens} + print('creating index...') + # fetch info from instances + Anns, Imgs, Cats, imgToAnns = {}, {}, {}, {} + for ann in self.data['annotations']: + Anns[ann['id']] = ann + imgToAnns[ann['image_id']] = imgToAnns.get(ann['image_id'], []) + [ann] + for img in self.data['images']: + Imgs[img['id']] = img + for cat in self.data['categories']: + Cats[cat['id']] = cat['name'] + + # fetch info from refs + Refs, imgToRefs, refToAnn, annToRef, catToRefs = {}, {}, {}, {}, {} + Sents, sentToRef, 
sentToTokens = {}, {}, {} + for ref in self.data['refs']: + # ids + ref_id = ref['ref_id'] + ann_id = ref['ann_id'] + category_id = ref['category_id'] + image_id = ref['image_id'] + + # add mapping related to ref + Refs[ref_id] = ref + imgToRefs[image_id] = imgToRefs.get(image_id, []) + [ref] + catToRefs[category_id] = catToRefs.get(category_id, []) + [ref] + refToAnn[ref_id] = Anns[ann_id] + annToRef[ann_id] = ref + + # add mapping of sent + for sent in ref['sentences']: + Sents[sent['sent_id']] = sent + sentToRef[sent['sent_id']] = ref + sentToTokens[sent['sent_id']] = sent['tokens'] + + # create class members + self.Refs = Refs + self.Anns = Anns + self.Imgs = Imgs + self.Cats = Cats + self.Sents = Sents + self.imgToRefs = imgToRefs + self.imgToAnns = imgToAnns + self.refToAnn = refToAnn + self.annToRef = annToRef + self.catToRefs = catToRefs + self.sentToRef = sentToRef + self.sentToTokens = sentToTokens + print('index created.') + + def getRefIds(self, image_ids=[], cat_ids=[], ref_ids=[], split=''): + image_ids = image_ids if type(image_ids) == list else [image_ids] + cat_ids = cat_ids if type(cat_ids) == list else [cat_ids] + ref_ids = ref_ids if type(ref_ids) == list else [ref_ids] + + if len(image_ids) == len(cat_ids) == len(ref_ids) == len(split) == 0: + refs = self.data['refs'] + else: + if not len(image_ids) == 0: + refs = list(itertools.chain.from_iterable(self.imgToRefs[image_id] for image_id in image_ids if image_id in self.imgToRefs)) # flatten the per-image lists into one list of refs + else: + refs = self.data['refs'] + if not len(cat_ids) == 0: + refs = [ref for ref in refs if ref['category_id'] in cat_ids] + if not len(ref_ids) == 0: + refs = [ref for ref in refs if ref['ref_id'] in ref_ids] + if not len(split) == 0: + if split in ['testA', 'testB', 'testC']: + refs = [ref for ref in refs if split[-1] in ref['split']] # we also consider testAB, testBC, ... + elif split in ['testAB', 'testBC', 'testAC']: + refs = [ref for ref in refs if ref['split'] == split] # rarely used I guess... 
+ elif split == 'test': + refs = [ref for ref in refs if 'test' in ref['split']] + elif split == 'train' or split == 'val': + refs = [ref for ref in refs if ref['split'] == split] + else: + print('No such split [%s]' % split) + sys.exit() + ref_ids = [ref['ref_id'] for ref in refs] + return ref_ids + + def getAnnIds(self, image_ids=[], cat_ids=[], ref_ids=[]): + image_ids = image_ids if type(image_ids) == list else [image_ids] + cat_ids = cat_ids if type(cat_ids) == list else [cat_ids] + ref_ids = ref_ids if type(ref_ids) == list else [ref_ids] + + if len(image_ids) == len(cat_ids) == len(ref_ids) == 0: + ann_ids = [ann['id'] for ann in self.data['annotations']] + else: + if not len(image_ids) == 0: + lists = [self.imgToAnns[image_id] for image_id in image_ids if image_id in self.imgToAnns] # list of [anns] + anns = list(itertools.chain.from_iterable(lists)) + else: + anns = self.data['annotations'] + if not len(cat_ids) == 0: + anns = [ann for ann in anns if ann['category_id'] in cat_ids] + ann_ids = [ann['id'] for ann in anns] + if not len(ref_ids) == 0: + ann_ids = list(set(ann_ids).intersection(set([self.Refs[ref_id]['ann_id'] for ref_id in ref_ids]))) # keep only anns referred to by the given refs + return ann_ids + + def getImgIds(self, ref_ids=[]): + ref_ids = ref_ids if type(ref_ids) == list else [ref_ids] + + if not len(ref_ids) == 0: + image_ids = list(set([self.Refs[ref_id]['image_id'] for ref_id in ref_ids])) + else: + image_ids = list(self.Imgs.keys()) + return image_ids + + def getCatIds(self): + return list(self.Cats.keys()) + + def loadRefs(self, ref_ids=[]): + if type(ref_ids) == list: + return [self.Refs[ref_id] for ref_id in ref_ids] + elif type(ref_ids) == int: + return [self.Refs[ref_ids]] + + def loadAnns(self, ann_ids=[]): + if type(ann_ids) == list: + return [self.Anns[ann_id] for ann_id in ann_ids] + elif type(ann_ids) == int or type(ann_ids) == str: # str replaces Python 2's unicode + return [self.Anns[ann_ids]] + + def loadImgs(self, image_ids=[]): + if type(image_ids) == list: + return [self.Imgs[image_id] for image_id in image_ids] + 
elif type(image_ids) == int: + return [self.Imgs[image_ids]] + + def loadCats(self, cat_ids=[]): + if type(cat_ids) == list: + return [self.Cats[cat_id] for cat_id in cat_ids] + elif type(cat_ids) == int: + return [self.Cats[cat_ids]] + + def getRefBox(self, ref_id): + ann = self.refToAnn[ref_id] + return ann['bbox'] # [x, y, w, h] + + def showRef(self, ref, seg_box='seg'): + ax = plt.gca() + # show image + image = self.Imgs[ref['image_id']] + I = io.imread(osp.join(self.IMAGE_DIR, image['file_name'])) + ax.imshow(I) + # show refer expression + for sid, sent in enumerate(ref['sentences']): + print('%s. %s' % (sid+1, sent['sent'])) + # show segmentations + if seg_box == 'seg': + ann_id = ref['ann_id'] + ann = self.Anns[ann_id] + polygons = [] + color = [] + c = 'none' + if type(ann['segmentation'][0]) == list: + # polygon used for refcoco* + for seg in ann['segmentation']: + poly = np.array(seg).reshape((len(seg)//2, 2)) # integer division for Python 3 + polygons.append(Polygon(poly, True, alpha=0.4)) + color.append(c) + p = PatchCollection(polygons, facecolors=color, edgecolors=(1,1,0,0), linewidths=3, alpha=1) + ax.add_collection(p) # thick yellow polygon + p = PatchCollection(polygons, facecolors=color, edgecolors=(1,0,0,0), linewidths=1, alpha=1) + ax.add_collection(p) # thin red polygon + else: + # mask used for refclef + rle = ann['segmentation'] + m = mask.decode(rle) + img = np.ones( (m.shape[0], m.shape[1], 3) ) + color_mask = np.array([2.0,166.0,101.0])/255 + for i in range(3): + img[:,:,i] = color_mask[i] + ax.imshow(np.dstack( (img, m*0.5) )) + # show bounding-box + elif seg_box == 'box': + ann_id = ref['ann_id'] + ann = self.Anns[ann_id] + bbox = self.getRefBox(ref['ref_id']) + box_plot = Rectangle((bbox[0], bbox[1]), bbox[2], bbox[3], fill=False, edgecolor='green', linewidth=3) + ax.add_patch(box_plot) + + def getMask(self, ref): + # return mask, area and mask-center + ann = self.refToAnn[ref['ref_id']] + image = self.Imgs[ref['image_id']] + if 
type(ann['segmentation'][0]) == list: # polygon + rle = mask.frPyObjects(ann['segmentation'], image['height'], image['width']) + else: + rle = ann['segmentation'] + m = mask.decode(rle) + m = np.sum(m, axis=2) # sometimes there are multiple binary maps (corresponding to multiple segs) + m = m.astype(np.uint8) # convert to np.uint8 + # compute area + area = sum(mask.area(rle)) # should be close to ann['area'] + return {'mask': m, 'area': area} + # # position + # position_x = np.mean(np.where(m==1)[1]) # [1] means columns (matlab style) -> x (c style) + # position_y = np.mean(np.where(m==1)[0]) # [0] means rows (matlab style) -> y (c style) + # # mass position (if there were multiple regions, we use the largest one.) + # label_m = label(m, connectivity=m.ndim) + # regions = regionprops(label_m) + # if len(regions) > 0: + # largest_id = np.argmax(np.array([props.filled_area for props in regions])) + # largest_props = regions[largest_id] + # mass_y, mass_x = largest_props.centroid + # else: + # mass_x, mass_y = position_x, position_y + # # if the centroid is not in the mask, we find the closest mask point to it + # if m[mass_y, mass_x] != 1: + # print('Finding closest mask point ...') 
+ # kernel = np.ones((10, 10),np.uint8) + # me = cv2.erode(m, kernel, iterations = 1) + # points = zip(np.where(me == 1)[0].tolist(), np.where(me == 1)[1].tolist()) # row, col style + # points = np.array(points) + # dist = np.sum((points - (mass_y, mass_x))**2, axis=1) + # id = np.argsort(dist)[0] + # mass_y, mass_x = points[id] + # # return + # return {'mask': m, 'area': area, 'position_x': position_x, 'position_y': position_y, 'mass_x': mass_x, 'mass_y': mass_y} + # # show image and mask + # I = io.imread(osp.join(self.IMAGE_DIR, image['file_name'])) + # plt.figure() + # plt.imshow(I) + # ax = plt.gca() + # img = np.ones( (m.shape[0], m.shape[1], 3) ) + # color_mask = np.array([2.0,166.0,101.0])/255 + # for i in range(3): + # img[:,:,i] = color_mask[i] + # ax.imshow(np.dstack( (img, m*0.5) )) + # plt.show() + + def showMask(self, ref): + M = self.getMask(ref) + msk = M['mask'] + ax = plt.gca() + ax.imshow(msk) + + +if __name__ == '__main__': + refer = REFER('data', dataset='refcocog', splitBy='google') # data_root is required; 'data' is a placeholder for your dataset folder + ref_ids = refer.getRefIds() + print(len(ref_ids)) + + print(len(refer.Imgs)) + print(len(refer.imgToRefs)) + + ref_ids = refer.getRefIds(split='train') + print('There are %s training referred objects.' % len(ref_ids)) + + for ref_id in ref_ids: + ref = refer.loadRefs(ref_id)[0] + if len(ref['sentences']) < 2: + continue + + pprint(ref) + print('The label is %s.' % refer.Cats[ref['category_id']]) + plt.figure() + refer.showRef(ref, seg_box='box') + plt.show() + + # plt.figure() + # refer.showMask(ref) + # plt.show() diff --git a/tools/refer/setup.py b/tools/refer/setup.py new file mode 100644 index 0000000..e055900 --- /dev/null +++ b/tools/refer/setup.py @@ -0,0 +1,25 @@ +""" +This code builds mask.so, which is used to visualize the segmentation of the referred object. 
+All "mask" related code is copied from https://github.com/pdollar/coco.git +""" +from distutils.core import setup +from Cython.Build import cythonize +from distutils.extension import Extension +import numpy as np + +ext_modules = [ + Extension( + 'external._mask', + sources=['external/maskApi.c', 'external/_mask.pyx'], + include_dirs = [np.get_include(), 'external'], + extra_compile_args=['-Wno-cpp', '-Wno-unused-function', '-std=c99'], + ) + ] + +setup( + name='external', + packages=['external'], + package_dir = {'external': 'external'}, + version='2.0', + ext_modules=cythonize(ext_modules) + ) diff --git a/tools/refer/test/sample_expressions_testA.json b/tools/refer/test/sample_expressions_testA.json new file mode 100644 index 0000000..275e3a3 --- /dev/null +++ b/tools/refer/test/sample_expressions_testA.json @@ -0,0 +1 @@ +{"predictions":[{"sent":"man in black","ref_id":47},{"sent":"person on right","ref_id":109},{"sent":"woman in red","ref_id":110},{"sent":"car behind bike","ref_id":111},{"sent":"car on left","ref_id":112},{"sent":"man in blue","ref_id":382},{"sent":"man in white","ref_id":383},{"sent":"left person","ref_id":519},{"sent":"man on right","ref_id":520},{"sent":"person in background","ref_id":525},{"sent":"person on left","ref_id":526},{"sent":"man in white","ref_id":527},{"sent":"guy in white","ref_id":528},{"sent":"guy in red","ref_id":537},{"sent":"white shirt","ref_id":538},{"sent":"player in white","ref_id":539},{"sent":"red shirt","ref_id":557},{"sent":"girl","ref_id":558},{"sent":"baby","ref_id":588},{"sent":"baby","ref_id":589},{"sent":"woman in front","ref_id":640},{"sent":"girl","ref_id":641},{"sent":"right guy","ref_id":732},{"sent":"man in white","ref_id":733},{"sent":"middle guy","ref_id":734},{"sent":"woman","ref_id":756},{"sent":"man on right","ref_id":757},{"sent":"woman","ref_id":814},{"sent":"man in white","ref_id":815},{"sent":"man in white shirt","ref_id":828},{"sent":"woman on right","ref_id":829},{"sent":"man in 
red","ref_id":931},{"sent":"woman in pink","ref_id":932},{"sent":"girl in pink","ref_id":933},{"sent":"middle guy","ref_id":945},{"sent":"second from right","ref_id":946},{"sent":"left guy","ref_id":947},{"sent":"white jacket","ref_id":954},{"sent":"right guy","ref_id":955},{"sent":"blue jacket","ref_id":956},{"sent":"man in white shirt","ref_id":1023},{"sent":"man","ref_id":1024},{"sent":"man in back","ref_id":1052},{"sent":"left guy","ref_id":1053},{"sent":"woman on right","ref_id":1152},{"sent":"woman on right","ref_id":1153},{"sent":"left guy","ref_id":1154},{"sent":"woman on right","ref_id":1333},{"sent":"man in black shirt","ref_id":1334},{"sent":"man","ref_id":1362},{"sent":"man","ref_id":1363},{"sent":"right guy","ref_id":1371},{"sent":"left guy","ref_id":1372},{"sent":"man in front","ref_id":1406},{"sent":"man on left","ref_id":1407},{"sent":"person on right","ref_id":1568},{"sent":"person in front","ref_id":1569},{"sent":"man in black","ref_id":1582},{"sent":"man in front","ref_id":1583},{"sent":"right skier","ref_id":1623},{"sent":"person in front","ref_id":1624},{"sent":"second from left","ref_id":1679},{"sent":"man on left","ref_id":1680},{"sent":"second from right","ref_id":1681},{"sent":"left guy","ref_id":1682},{"sent":"woman on right","ref_id":1683},{"sent":"girl on right","ref_id":1684},{"sent":"man on right","ref_id":1811},{"sent":"man in front of man in white shirt","ref_id":1812},{"sent":"woman in white shirt","ref_id":1861},{"sent":"man in black","ref_id":1862},{"sent":"groom","ref_id":1882},{"sent":"bride","ref_id":1883},{"sent":"middle guy","ref_id":1977},{"sent":"left guy","ref_id":1978},{"sent":"right guy","ref_id":1979},{"sent":"second from left","ref_id":1980},{"sent":"person on left","ref_id":1990},{"sent":"left person","ref_id":1991},{"sent":"player","ref_id":2001},{"sent":"top left corner","ref_id":2002},{"sent":"girl in white on left","ref_id":2129},{"sent":"white shirt","ref_id":2130},{"sent":"woman in 
white","ref_id":2131},{"sent":"red jacket","ref_id":2173},{"sent":"red","ref_id":2174},{"sent":"catcher","ref_id":2256},{"sent":"umpire","ref_id":2257},{"sent":"baby","ref_id":2264},{"sent":"man","ref_id":2265},{"sent":"boy in blue","ref_id":2291},{"sent":"boy in red","ref_id":2292},{"sent":"man in black","ref_id":2375},{"sent":"man in black","ref_id":2376},{"sent":"blue jacket","ref_id":2721},{"sent":"bottom left","ref_id":2722},{"sent":"man","ref_id":2767},{"sent":"man","ref_id":2768},{"sent":"batter","ref_id":2805},{"sent":"right guy","ref_id":2806},{"sent":"batter","ref_id":2807},{"sent":"woman in black","ref_id":2981},{"sent":"woman in white","ref_id":2982},{"sent":"left girl","ref_id":3247},{"sent":"man in white","ref_id":3248},{"sent":"man on left","ref_id":3257},{"sent":"woman in middle","ref_id":3258},{"sent":"woman on right","ref_id":3259},{"sent":"man in middle","ref_id":3260},{"sent":"guy on right","ref_id":3366},{"sent":"left person","ref_id":3367},{"sent":"girl in pink","ref_id":3768},{"sent":"girl in pink","ref_id":3769},{"sent":"right guy","ref_id":3772},{"sent":"man","ref_id":3773},{"sent":"man in blue shirt","ref_id":3805},{"sent":"person in blue shirt","ref_id":3806},{"sent":"man in black","ref_id":3807},{"sent":"guy in red","ref_id":4002},{"sent":"second horse from left","ref_id":4003},{"sent":"guy in blue shirt","ref_id":4014},{"sent":"man in blue shirt","ref_id":4015},{"sent":"left person","ref_id":4016},{"sent":"man in blue","ref_id":4017},{"sent":"girl on right","ref_id":4089},{"sent":"girl","ref_id":4090},{"sent":"woman","ref_id":4101},{"sent":"girl","ref_id":4102},{"sent":"woman in black","ref_id":4143},{"sent":"person sitting on left","ref_id":4144},{"sent":"man in black","ref_id":4145},{"sent":"white shirt","ref_id":4159},{"sent":"man on right","ref_id":4160},{"sent":"right girl","ref_id":4174},{"sent":"left girl","ref_id":4175},{"sent":"person on right","ref_id":4176},{"sent":"girl on left","ref_id":4177},{"sent":"woman in 
red","ref_id":4178},{"sent":"bride","ref_id":4187},{"sent":"right kid","ref_id":4188},{"sent":"person on left","ref_id":4190},{"sent":"woman in black","ref_id":4191},{"sent":"woman in white","ref_id":4192},{"sent":"left person","ref_id":4308},{"sent":"person on left","ref_id":4309},{"sent":"man on left","ref_id":4333},{"sent":"man in blue shirt","ref_id":4334},{"sent":"batter","ref_id":4345},{"sent":"catcher","ref_id":4346},{"sent":"umpire","ref_id":4347},{"sent":"man on right","ref_id":4396},{"sent":"right person","ref_id":4397},{"sent":"woman in blue","ref_id":4461},{"sent":"man in black","ref_id":4462},{"sent":"man on left","ref_id":4463},{"sent":"woman on right","ref_id":4485},{"sent":"woman","ref_id":4486},{"sent":"right horse","ref_id":4487},{"sent":"right horse","ref_id":4488},{"sent":"left horse","ref_id":4489},{"sent":"woman","ref_id":4578},{"sent":"woman","ref_id":4579},{"sent":"left bottom corner","ref_id":4580},{"sent":"woman in blue","ref_id":4581},{"sent":"head bottom right","ref_id":4582},{"sent":"woman","ref_id":4616},{"sent":"man","ref_id":4617},{"sent":"woman in black","ref_id":4711},{"sent":"left horse","ref_id":4712},{"sent":"umpire","ref_id":4765},{"sent":"catcher","ref_id":4766},{"sent":"man on right","ref_id":4868},{"sent":"man on right","ref_id":4869},{"sent":"man","ref_id":4870},{"sent":"white shirt","ref_id":5012},{"sent":"right guy","ref_id":5118},{"sent":"man in white","ref_id":5119},{"sent":"woman","ref_id":5149},{"sent":"man","ref_id":5150},{"sent":"right dog","ref_id":5170},{"sent":"left dog","ref_id":5171},{"sent":"woman","ref_id":5244},{"sent":"woman","ref_id":5245},{"sent":"woman in black","ref_id":5289},{"sent":"woman in black","ref_id":5290},{"sent":"man on right","ref_id":5291},{"sent":"person on left","ref_id":5292},{"sent":"woman in black","ref_id":5293},{"sent":"person on right","ref_id":5309},{"sent":"woman in red","ref_id":5310},{"sent":"girl","ref_id":5311},{"sent":"person on 
right","ref_id":5389},{"sent":"man","ref_id":5390},{"sent":"man","ref_id":5550},{"sent":"woman on right","ref_id":5551},{"sent":"man on left","ref_id":5552},{"sent":"man in white","ref_id":5615},{"sent":"woman in red","ref_id":5648},{"sent":"woman in blue","ref_id":5649},{"sent":"woman in black","ref_id":5650},{"sent":"catcher","ref_id":5767},{"sent":"batter","ref_id":5768},{"sent":"umpire","ref_id":5769},{"sent":"couch on left","ref_id":5776},{"sent":"couch on right","ref_id":5777},{"sent":"guy in red","ref_id":5782},{"sent":"red shirt","ref_id":5783},{"sent":"batter","ref_id":5784},{"sent":"girl on left","ref_id":5811},{"sent":"woman","ref_id":5812},{"sent":"bowl of white stuff on left","ref_id":5923},{"sent":"bowl of rice in the back","ref_id":5924},{"sent":"top right glass","ref_id":5925},{"sent":"top left corner","ref_id":5926},{"sent":"skier in middle","ref_id":6042},{"sent":"left skier","ref_id":6043},{"sent":"guy in white shirt","ref_id":6073},{"sent":"man","ref_id":6074},{"sent":"man in black","ref_id":6081},{"sent":"guy in white","ref_id":6082},{"sent":"umpire","ref_id":6118},{"sent":"batter","ref_id":6119},{"sent":"right guy","ref_id":6210},{"sent":"guy on right","ref_id":6211},{"sent":"kid","ref_id":6237},{"sent":"kid","ref_id":6238},{"sent":"baby","ref_id":6239},{"sent":"woman","ref_id":6257},{"sent":"person on right","ref_id":6461},{"sent":"woman","ref_id":6462},{"sent":"hand on left","ref_id":6607},{"sent":"left kid","ref_id":6698},{"sent":"player in red","ref_id":6699},{"sent":"player","ref_id":6799},{"sent":"guy in white","ref_id":6800},{"sent":"right person","ref_id":6809},{"sent":"guy in red","ref_id":6810},{"sent":"catcher","ref_id":6811},{"sent":"catcher","ref_id":6812},{"sent":"guy in white shirt","ref_id":6956},{"sent":"girl in blue","ref_id":6957},{"sent":"red shirt","ref_id":6958},{"sent":"bottom left corner","ref_id":7045},{"sent":"girl in 
black","ref_id":7046},{"sent":"girl","ref_id":7047},{"sent":"batter","ref_id":7170},{"sent":"batter","ref_id":7171},{"sent":"catcher","ref_id":7172},{"sent":"hand","ref_id":7211},{"sent":"the girl","ref_id":7212},{"sent":"umpire","ref_id":7249},{"sent":"catcher","ref_id":7250},{"sent":"catcher","ref_id":7251},{"sent":"man in white shirt","ref_id":7377},{"sent":"man on right","ref_id":7378},{"sent":"woman in white","ref_id":7379},{"sent":"girl","ref_id":7477},{"sent":"girl","ref_id":7478},{"sent":"man in red shirt","ref_id":7496},{"sent":"guy in red shirt","ref_id":7497},{"sent":"red shirt","ref_id":7498},{"sent":"red shirt","ref_id":7499},{"sent":"woman in blue","ref_id":7568},{"sent":"woman in white","ref_id":7569},{"sent":"man on left","ref_id":7570},{"sent":"woman on left","ref_id":7571},{"sent":"woman on right","ref_id":7572},{"sent":"man in blue shirt","ref_id":7573},{"sent":"man in blue","ref_id":7574},{"sent":"person in front of man in white shirt","ref_id":7575},{"sent":"chair on left","ref_id":7576},{"sent":"girl in white","ref_id":7608},{"sent":"left guy","ref_id":7609},{"sent":"man on right","ref_id":7610},{"sent":"boy in middle","ref_id":7611},{"sent":"man on right","ref_id":7612},{"sent":"man in front","ref_id":7613},{"sent":"right guy","ref_id":7710},{"sent":"right guy","ref_id":7711},{"sent":"man on left","ref_id":7712},{"sent":"man in blue shirt","ref_id":7742},{"sent":"man in middle","ref_id":7743},{"sent":"left guy","ref_id":7851},{"sent":"right guy","ref_id":7852},{"sent":"right woman","ref_id":7853},{"sent":"right guy","ref_id":7854},{"sent":"left most person","ref_id":7855},{"sent":"man in white shirt on left","ref_id":7856},{"sent":"man in front","ref_id":7857},{"sent":"girl on right","ref_id":7858},{"sent":"person on left","ref_id":7890},{"sent":"person on right","ref_id":7891},{"sent":"person in background","ref_id":7929},{"sent":"player","ref_id":7930},{"sent":"kid in red","ref_id":7967},{"sent":"girl in green","ref_id":7968},{"sent":"man 
in white shirt","ref_id":7969},{"sent":"man in white","ref_id":7970},{"sent":"girl","ref_id":7981},{"sent":"girl in pink","ref_id":7982},{"sent":"player in white","ref_id":8005},{"sent":"player in white","ref_id":8006},{"sent":"man in front","ref_id":8086},{"sent":"right person","ref_id":8087},{"sent":"left person","ref_id":8088},{"sent":"guy on bike","ref_id":8089},{"sent":"girl in white","ref_id":8116},{"sent":"player on left","ref_id":8117},{"sent":"player in red","ref_id":8118},{"sent":"right kid","ref_id":8119},{"sent":"right guy","ref_id":8214},{"sent":"second from left","ref_id":8215},{"sent":"second from right","ref_id":8216},{"sent":"left guy","ref_id":8217},{"sent":"man in red","ref_id":8247},{"sent":"woman in black","ref_id":8248},{"sent":"man on left","ref_id":8308},{"sent":"right guy","ref_id":8309},{"sent":"right guy","ref_id":8310},{"sent":"person on right","ref_id":8354},{"sent":"girl in white","ref_id":8355},{"sent":"horse on right","ref_id":8356},{"sent":"horse","ref_id":8357},{"sent":"right UNK","ref_id":8375},{"sent":"woman","ref_id":8376},{"sent":"right UNK","ref_id":8377},{"sent":"person on right","ref_id":8378},{"sent":"man on left","ref_id":8379},{"sent":"person on right","ref_id":8413},{"sent":"man on right","ref_id":8414},{"sent":"woman","ref_id":8415},{"sent":"bride","ref_id":8416},{"sent":"man in blue","ref_id":8490},{"sent":"girl in pink","ref_id":8491},{"sent":"girl in red","ref_id":8492},{"sent":"kid in red","ref_id":8500},{"sent":"number 5","ref_id":8501},{"sent":"guy in white","ref_id":8502},{"sent":"woman in white","ref_id":8685},{"sent":"kid in white","ref_id":8686},{"sent":"man","ref_id":8687},{"sent":"woman sitting down","ref_id":8694},{"sent":"man on right","ref_id":8695},{"sent":"man on right","ref_id":8696},{"sent":"woman on left","ref_id":8697},{"sent":"man in black","ref_id":8698},{"sent":"guy on right","ref_id":8699},{"sent":"left person","ref_id":8700},{"sent":"woman","ref_id":8701},{"sent":"guy on 
left","ref_id":8758},{"sent":"girl on right","ref_id":8759},{"sent":"woman on right","ref_id":8760},{"sent":"woman on left","ref_id":8770},{"sent":"woman","ref_id":8771},{"sent":"man on left","ref_id":8772},{"sent":"man in white","ref_id":8773},{"sent":"man on left","ref_id":8774},{"sent":"girl in pink","ref_id":8896},{"sent":"girl in white","ref_id":8897},{"sent":"man in black shirt","ref_id":8898},{"sent":"man in black shirt","ref_id":8899},{"sent":"person in white shirt","ref_id":9238},{"sent":"man in black shirt","ref_id":9239},{"sent":"woman on right","ref_id":9255},{"sent":"man in red","ref_id":9256},{"sent":"man in black","ref_id":9493},{"sent":"man","ref_id":9494},{"sent":"woman on left","ref_id":9509},{"sent":"woman in middle","ref_id":9510},{"sent":"man in white shirt","ref_id":9511},{"sent":"second from right","ref_id":9532},{"sent":"second from left","ref_id":9533},{"sent":"guy in white shirt","ref_id":9534},{"sent":"tennis player","ref_id":9760},{"sent":"man in blue","ref_id":9761},{"sent":"woman in middle","ref_id":9799},{"sent":"man in white shirt","ref_id":9800},{"sent":"man","ref_id":9856},{"sent":"woman sitting","ref_id":9857},{"sent":"woman","ref_id":9858},{"sent":"white car behind the guy in the background","ref_id":9865},{"sent":"blue shirt","ref_id":9866},{"sent":"batter","ref_id":9867},{"sent":"guy in background behind fence","ref_id":9949},{"sent":"batter","ref_id":9950},{"sent":"woman","ref_id":10015},{"sent":"girl","ref_id":10016},{"sent":"white mug","ref_id":10047},{"sent":"glass on left","ref_id":10048},{"sent":"woman","ref_id":10049},{"sent":"man on left","ref_id":10114},{"sent":"guy in white shirt","ref_id":10115},{"sent":"girl in pink dress","ref_id":10136},{"sent":"woman in pink","ref_id":10137},{"sent":"person in white shirt","ref_id":10138},{"sent":"left person","ref_id":10139},{"sent":"man in black on right","ref_id":10140},{"sent":"woman in pink","ref_id":10141},{"sent":"player in white","ref_id":10217},{"sent":"red 
shirt","ref_id":10218},{"sent":"guy in yellow","ref_id":10219},{"sent":"man on left","ref_id":10271},{"sent":"man on right","ref_id":10272},{"sent":"batter","ref_id":10332},{"sent":"right bear","ref_id":10374},{"sent":"bear in red","ref_id":10375},{"sent":"girl in pink","ref_id":10376},{"sent":"boy in middle","ref_id":10377},{"sent":"left guy","ref_id":10412},{"sent":"right guy","ref_id":10413},{"sent":"bottom right girl","ref_id":10449},{"sent":"bottom left head","ref_id":10450},{"sent":"man on left","ref_id":10554},{"sent":"woman","ref_id":10555},{"sent":"right guy","ref_id":10629},{"sent":"man on right","ref_id":10630},{"sent":"man in black shirt","ref_id":10749},{"sent":"man on left","ref_id":10750},{"sent":"woman eating","ref_id":10879},{"sent":"woman in white shirt","ref_id":10895},{"sent":"man in blue shirt","ref_id":10896},{"sent":"girl in red","ref_id":10995},{"sent":"woman","ref_id":10996},{"sent":"couch on right","ref_id":11019},{"sent":"dog","ref_id":11020},{"sent":"woman on left","ref_id":11021},{"sent":"man on right","ref_id":11022},{"sent":"left couch","ref_id":11023},{"sent":"right couch","ref_id":11024},{"sent":"right person","ref_id":11025},{"sent":"woman on left","ref_id":11026},{"sent":"man in middle","ref_id":11027},{"sent":"white pants","ref_id":11032},{"sent":"person in black","ref_id":11033},{"sent":"right kid","ref_id":11101},{"sent":"right kid","ref_id":11102},{"sent":"kid in middle","ref_id":11103},{"sent":"right kid","ref_id":11104},{"sent":"middle person","ref_id":11159},{"sent":"woman on left","ref_id":11160},{"sent":"woman in front","ref_id":11161},{"sent":"man in black shirt and jeans","ref_id":11203},{"sent":"man in blue shirt","ref_id":11204},{"sent":"woman in black","ref_id":11205},{"sent":"catcher","ref_id":11224},{"sent":"batter","ref_id":11225},{"sent":"arm","ref_id":11372},{"sent":"arm","ref_id":11373},{"sent":"man in white","ref_id":11409},{"sent":"guy in red","ref_id":11410},{"sent":"batter","ref_id":11411},{"sent":"woman in 
red","ref_id":11541},{"sent":"woman on left","ref_id":11637},{"sent":"man on right","ref_id":11638},{"sent":"woman in black","ref_id":11639},{"sent":"person on left","ref_id":11676},{"sent":"person on left","ref_id":11677},{"sent":"girl on left","ref_id":11752},{"sent":"woman","ref_id":11753},{"sent":"man on left","ref_id":11757},{"sent":"right sheep","ref_id":11758},{"sent":"person in front","ref_id":11769},{"sent":"arm","ref_id":11770},{"sent":"right guy","ref_id":11894},{"sent":"blue shirt","ref_id":11895},{"sent":"catcher","ref_id":12001},{"sent":"batter","ref_id":12002},{"sent":"woman on left","ref_id":12023},{"sent":"man","ref_id":12024},{"sent":"second from right","ref_id":12025},{"sent":"man on right","ref_id":12026},{"sent":"man on right","ref_id":12027},{"sent":"second from left","ref_id":12028},{"sent":"man on left","ref_id":12029},{"sent":"second from right","ref_id":12030},{"sent":"woman in red","ref_id":12261},{"sent":"right front animal","ref_id":12262},{"sent":"front cow","ref_id":12263},{"sent":"man in white","ref_id":12357},{"sent":"man","ref_id":12358},{"sent":"person on right","ref_id":12665},{"sent":"woman","ref_id":12666},{"sent":"man in blue","ref_id":12719},{"sent":"woman in red","ref_id":12720},{"sent":"woman in black","ref_id":12721},{"sent":"woman","ref_id":12758},{"sent":"man in blue","ref_id":12759},{"sent":"girl on right","ref_id":12963},{"sent":"woman","ref_id":12964},{"sent":"man on left","ref_id":13055},{"sent":"man on left","ref_id":13056},{"sent":"man on right","ref_id":13057},{"sent":"woman","ref_id":13087},{"sent":"man","ref_id":13088},{"sent":"girl in white","ref_id":13209},{"sent":"man on left","ref_id":13210},{"sent":"bowl of food in front","ref_id":13236},{"sent":"chair top left","ref_id":13237},{"sent":"bowl of food in front","ref_id":13238},{"sent":"top right corner","ref_id":13239},{"sent":"top right corner","ref_id":13373},{"sent":"table top left","ref_id":13374},{"sent":"guy in white shirt","ref_id":13382},{"sent":"guy 
in white","ref_id":13383},{"sent":"guy on right","ref_id":13386},{"sent":"man on left","ref_id":13387},{"sent":"table cloth","ref_id":13410},{"sent":"woman on left","ref_id":13411},{"sent":"girl on right","ref_id":13412},{"sent":"guy in white shirt","ref_id":13439},{"sent":"man in blue shirt","ref_id":13440},{"sent":"girl in white","ref_id":13441},{"sent":"man in white shirt","ref_id":13442},{"sent":"girl in middle","ref_id":13625},{"sent":"girl on left","ref_id":13626},{"sent":"girl in pink","ref_id":13627},{"sent":"man in blue shirt","ref_id":13793},{"sent":"woman in red","ref_id":13794},{"sent":"girl","ref_id":13869},{"sent":"woman","ref_id":13870},{"sent":"person on left","ref_id":13894},{"sent":"woman in black","ref_id":13895},{"sent":"top left corner","ref_id":14024},{"sent":"left pizza","ref_id":14025},{"sent":"right pizza","ref_id":14026},{"sent":"top right corner","ref_id":14027},{"sent":"white shirt left","ref_id":14038},{"sent":"person in red","ref_id":14039},{"sent":"left bed","ref_id":14040},{"sent":"right person","ref_id":14041},{"sent":"red thing on left","ref_id":14042},{"sent":"person on left","ref_id":14102},{"sent":"person on right","ref_id":14103},{"sent":"person in front","ref_id":14104},{"sent":"person in middle","ref_id":14105},{"sent":"person in blue","ref_id":14106},{"sent":"left chair","ref_id":14201},{"sent":"woman","ref_id":14202},{"sent":"chair on left","ref_id":14203},{"sent":"chair in front of woman","ref_id":14204},{"sent":"man in front","ref_id":14270},{"sent":"man in white shirt","ref_id":14271},{"sent":"bike on right","ref_id":14272},{"sent":"bike in front","ref_id":14273},{"sent":"batter","ref_id":14274},{"sent":"batter","ref_id":14275},{"sent":"umpire","ref_id":14276},{"sent":"batter","ref_id":14277},{"sent":"man","ref_id":14316},{"sent":"woman","ref_id":14317},{"sent":"kid on right","ref_id":14352},{"sent":"kid in middle","ref_id":14353},{"sent":"girl in pink","ref_id":14377},{"sent":"girl on 
left","ref_id":14378},{"sent":"girl","ref_id":14379},{"sent":"woman in pink","ref_id":14380},{"sent":"woman","ref_id":14482},{"sent":"woman","ref_id":14483},{"sent":"woman in black","ref_id":14519},{"sent":"man in middle","ref_id":14520},{"sent":"hand holding phone","ref_id":14521},{"sent":"woman in black","ref_id":14522},{"sent":"woman in green","ref_id":14523},{"sent":"woman in black","ref_id":14524},{"sent":"man on left","ref_id":14601},{"sent":"left guy","ref_id":14602},{"sent":"guy in black shirt","ref_id":14603},{"sent":"right person","ref_id":14694},{"sent":"guy in blue","ref_id":14695},{"sent":"umpire","ref_id":14755},{"sent":"catcher","ref_id":14756},{"sent":"batter","ref_id":14757},{"sent":"right person","ref_id":14855},{"sent":"left person","ref_id":14856},{"sent":"person on right","ref_id":14857},{"sent":"man in black shirt","ref_id":14869},{"sent":"man in white","ref_id":14870},{"sent":"kid on left","ref_id":14883},{"sent":"kid on right","ref_id":14884},{"sent":"right guy","ref_id":14940},{"sent":"girl in white","ref_id":14941},{"sent":"man on left","ref_id":14968},{"sent":"red bus","ref_id":14969},{"sent":"man in white shirt","ref_id":14981},{"sent":"man in white","ref_id":14982},{"sent":"girl in white shirt","ref_id":15085},{"sent":"man in white shirt","ref_id":15086},{"sent":"girl in white","ref_id":15087},{"sent":"white table in front of girl","ref_id":15088},{"sent":"arm on left","ref_id":15089},{"sent":"right person","ref_id":15092},{"sent":"woman in white","ref_id":15093},{"sent":"man in blue shirt","ref_id":15094},{"sent":"woman on left","ref_id":15253},{"sent":"person on right","ref_id":15254},{"sent":"white shirt","ref_id":15255},{"sent":"woman in white","ref_id":15342},{"sent":"man","ref_id":15343},{"sent":"hand on left","ref_id":15348},{"sent":"right person","ref_id":15349},{"sent":"woman on left","ref_id":15366},{"sent":"left person","ref_id":15367},{"sent":"woman on right","ref_id":15368},{"sent":"person in white 
shirt","ref_id":15369},{"sent":"person on left","ref_id":15370},{"sent":"guy","ref_id":15394},{"sent":"blue shirt","ref_id":15432},{"sent":"baby","ref_id":15433},{"sent":"woman in red","ref_id":15555},{"sent":"woman in blue","ref_id":15556},{"sent":"left person","ref_id":15563},{"sent":"man in white","ref_id":15564},{"sent":"woman","ref_id":15699},{"sent":"bottom left corner","ref_id":15754},{"sent":"man in white","ref_id":15755},{"sent":"right guy","ref_id":15825},{"sent":"man in suit","ref_id":15826},{"sent":"man on right","ref_id":15986},{"sent":"man","ref_id":15987},{"sent":"top right corner","ref_id":16068},{"sent":"tennis player","ref_id":16069},{"sent":"girl","ref_id":16077},{"sent":"man","ref_id":16078},{"sent":"left woman","ref_id":16126},{"sent":"woman on left","ref_id":16127},{"sent":"woman in middle","ref_id":16128},{"sent":"woman in middle","ref_id":16129},{"sent":"man in black suit","ref_id":16130},{"sent":"man in middle","ref_id":16131},{"sent":"woman","ref_id":16200},{"sent":"woman","ref_id":16201},{"sent":"right player","ref_id":16425},{"sent":"left player","ref_id":16426},{"sent":"right person","ref_id":16543},{"sent":"right person","ref_id":16544},{"sent":"guy in red","ref_id":16545},{"sent":"left blue","ref_id":16566},{"sent":"pink","ref_id":16567},{"sent":"person in white","ref_id":16568},{"sent":"person in front","ref_id":16569},{"sent":"horse","ref_id":16636},{"sent":"man in front","ref_id":16732},{"sent":"man in white","ref_id":16738},{"sent":"right guy","ref_id":16739},{"sent":"right person","ref_id":16740},{"sent":"man on left","ref_id":16741},{"sent":"right guy","ref_id":16786},{"sent":"red shirt","ref_id":16787},{"sent":"man in black","ref_id":16788},{"sent":"man in middle","ref_id":16804},{"sent":"woman on left","ref_id":16805},{"sent":"man in suit","ref_id":16892},{"sent":"top left corner","ref_id":16896},{"sent":"hand on left","ref_id":16897},{"sent":"person on left","ref_id":16898},{"sent":"head of person in front of 
girl","ref_id":17039},{"sent":"girl","ref_id":17040},{"sent":"left guy","ref_id":17138},{"sent":"woman in black","ref_id":17139},{"sent":"catcher","ref_id":17322},{"sent":"umpire","ref_id":17323},{"sent":"batter","ref_id":17324},{"sent":"man","ref_id":17488},{"sent":"girl","ref_id":17489},{"sent":"top right corner","ref_id":17497},{"sent":"person on left","ref_id":17498},{"sent":"white sheep","ref_id":17523},{"sent":"man on right","ref_id":17524},{"sent":"man on left","ref_id":17545},{"sent":"man on right","ref_id":17546},{"sent":"left person","ref_id":17579},{"sent":"right guy","ref_id":17580},{"sent":"catcher","ref_id":17622},{"sent":"player","ref_id":17623},{"sent":"right side of pizza","ref_id":17629},{"sent":"baby","ref_id":17630},{"sent":"woman on left","ref_id":17643},{"sent":"girl in front","ref_id":17644},{"sent":"woman in black","ref_id":17715},{"sent":"woman in black","ref_id":17716},{"sent":"man on left","ref_id":17717},{"sent":"man in white shirt","ref_id":17731},{"sent":"left guy","ref_id":17732},{"sent":"woman in middle","ref_id":17906},{"sent":"woman in white","ref_id":17907},{"sent":"batter","ref_id":17974},{"sent":"catcher","ref_id":17975},{"sent":"catcher","ref_id":17976},{"sent":"batter","ref_id":17977},{"sent":"guy on right","ref_id":17986},{"sent":"guy","ref_id":17987},{"sent":"woman in white shirt","ref_id":18064},{"sent":"woman in white shirt","ref_id":18065},{"sent":"man in black shirt","ref_id":18066},{"sent":"man on right","ref_id":18067},{"sent":"woman in black","ref_id":18127},{"sent":"woman in black","ref_id":18128},{"sent":"bride","ref_id":18162},{"sent":"woman","ref_id":18163},{"sent":"man","ref_id":18164},{"sent":"right couch","ref_id":18167},{"sent":"man on left","ref_id":18168},{"sent":"right couch","ref_id":18169},{"sent":"left couch","ref_id":18170},{"sent":"man on right","ref_id":18274},{"sent":"table in front of man","ref_id":18275},{"sent":"man in blue","ref_id":18276},{"sent":"man on right","ref_id":18277},{"sent":"bottom 
right corner","ref_id":18278},{"sent":"right guy","ref_id":18297},{"sent":"tie","ref_id":18298},{"sent":"man in white","ref_id":18325},{"sent":"man in background","ref_id":18326},{"sent":"umbrella","ref_id":18362},{"sent":"person on left","ref_id":18363},{"sent":"pink umbrella","ref_id":18364},{"sent":"girl in pink","ref_id":18448},{"sent":"girl in pink","ref_id":18449},{"sent":"right guy","ref_id":18488},{"sent":"man on right","ref_id":18489},{"sent":"man in blue shirt","ref_id":18490},{"sent":"kid in white shirt","ref_id":18584},{"sent":"guy in white shirt","ref_id":18585},{"sent":"red shirt","ref_id":18586},{"sent":"guy on left","ref_id":18701},{"sent":"left guy","ref_id":18702},{"sent":"guy on right","ref_id":18703},{"sent":"guy in front","ref_id":18704},{"sent":"person on left","ref_id":18738},{"sent":"woman","ref_id":18739},{"sent":"woman","ref_id":18740},{"sent":"woman on right","ref_id":18798},{"sent":"man on right","ref_id":18799},{"sent":"woman in black","ref_id":18800},{"sent":"person in white","ref_id":18804},{"sent":"right guy","ref_id":18805},{"sent":"right UNK","ref_id":18806},{"sent":"right person","ref_id":18846},{"sent":"left person","ref_id":18847},{"sent":"man in white","ref_id":18888},{"sent":"guy on right","ref_id":18889},{"sent":"person on left","ref_id":18912},{"sent":"man","ref_id":18913},{"sent":"right guy","ref_id":18914},{"sent":"red shirt","ref_id":18931},{"sent":"white shirt right","ref_id":18932},{"sent":"person in background in background","ref_id":18933},{"sent":"blurry person in background behind the tennis player","ref_id":18934},{"sent":"blurry person in background on left","ref_id":18935},{"sent":"guy in red shirt","ref_id":18936},{"sent":"guy in white shirt","ref_id":18937},{"sent":"tennis player","ref_id":18938},{"sent":"girl","ref_id":19008},{"sent":"person on left","ref_id":19040},{"sent":"girl in yellow","ref_id":19062},{"sent":"bottom left head","ref_id":19063},{"sent":"man on left","ref_id":19064},{"sent":"right bottom 
corner","ref_id":19132},{"sent":"person in white on left","ref_id":19133},{"sent":"man in black","ref_id":19134},{"sent":"bottom left corner","ref_id":19279},{"sent":"man","ref_id":19280},{"sent":"girl on right","ref_id":19325},{"sent":"kid","ref_id":19326},{"sent":"girl on right","ref_id":19327},{"sent":"man in blue","ref_id":19348},{"sent":"man on right","ref_id":19349},{"sent":"man on left","ref_id":19428},{"sent":"guy in middle","ref_id":19429},{"sent":"guy on right","ref_id":19430},{"sent":"woman in pink","ref_id":19433},{"sent":"man in blue shirt","ref_id":19434},{"sent":"girl in red","ref_id":19448},{"sent":"left girl","ref_id":19449},{"sent":"girl in red","ref_id":19450},{"sent":"right guy","ref_id":19451},{"sent":"girl on right","ref_id":19469},{"sent":"pizza on right","ref_id":19470},{"sent":"girl","ref_id":19471},{"sent":"woman in white","ref_id":19509},{"sent":"man in black","ref_id":19510},{"sent":"man in front","ref_id":19511},{"sent":"batter","ref_id":19512},{"sent":"guy in blue shirt behind fence","ref_id":19513},{"sent":"person in background on left","ref_id":19514},{"sent":"red shirt","ref_id":19515},{"sent":"man in red","ref_id":19516},{"sent":"person in red","ref_id":19517},{"sent":"woman on left","ref_id":19543},{"sent":"person in background","ref_id":19634},{"sent":"kid","ref_id":19635},{"sent":"left person","ref_id":19684},{"sent":"woman","ref_id":19685},{"sent":"woman in pink","ref_id":19686},{"sent":"woman on right","ref_id":19687},{"sent":"woman","ref_id":19688},{"sent":"girl on right","ref_id":19732},{"sent":"left guy","ref_id":19733},{"sent":"arm","ref_id":19743},{"sent":"right cake","ref_id":19744},{"sent":"cake","ref_id":19745},{"sent":"the arm on the left","ref_id":19746},{"sent":"bottom left hand","ref_id":19843},{"sent":"girl","ref_id":19844},{"sent":"right person","ref_id":19901},{"sent":"left person","ref_id":19902},{"sent":"red jacket","ref_id":19903},{"sent":"second from right","ref_id":19904},{"sent":"woman in 
red","ref_id":19941},{"sent":"woman on left","ref_id":19942},{"sent":"woman in white","ref_id":20041},{"sent":"woman on right","ref_id":20042},{"sent":"laptop on left","ref_id":20099},{"sent":"middle laptop","ref_id":20100},{"sent":"man in white shirt","ref_id":20101},{"sent":"left laptop","ref_id":20102},{"sent":"man in middle","ref_id":20103},{"sent":"man","ref_id":20246},{"sent":"woman","ref_id":20247},{"sent":"man on left","ref_id":20268},{"sent":"kid","ref_id":20269},{"sent":"right pizza","ref_id":20311},{"sent":"pizza on left","ref_id":20312},{"sent":"pizza on right","ref_id":20313},{"sent":"top right corner","ref_id":20314},{"sent":"arm in back","ref_id":20315},{"sent":"left person","ref_id":20389},{"sent":"kid","ref_id":20390},{"sent":"man in middle","ref_id":20420},{"sent":"man on left","ref_id":20421},{"sent":"top right corner","ref_id":20454},{"sent":"hand","ref_id":20455},{"sent":"woman on left","ref_id":20469},{"sent":"man in middle","ref_id":20470},{"sent":"woman on right","ref_id":20471},{"sent":"woman on right","ref_id":20479},{"sent":"man in front","ref_id":20480},{"sent":"woman","ref_id":20505},{"sent":"woman on left","ref_id":20506},{"sent":"man on left","ref_id":20512},{"sent":"girl in red","ref_id":20513},{"sent":"girl in blue","ref_id":20514},{"sent":"man","ref_id":20602},{"sent":"bride","ref_id":20603},{"sent":"man in black","ref_id":20649},{"sent":"man in black","ref_id":20650},{"sent":"person in front","ref_id":20663},{"sent":"left person","ref_id":20664},{"sent":"person on right","ref_id":20665},{"sent":"white shirt","ref_id":20666},{"sent":"person on right","ref_id":20667},{"sent":"bottom of the UNK","ref_id":20668},{"sent":"black thing on top of suitcase","ref_id":20755},{"sent":"legs on right","ref_id":20756},{"sent":"left leg","ref_id":20757},{"sent":"kid","ref_id":20791},{"sent":"girl","ref_id":20792},{"sent":"man on left","ref_id":20875},{"sent":"man on right","ref_id":20876},{"sent":"woman in middle","ref_id":20877},{"sent":"man in 
black","ref_id":20938},{"sent":"person on right","ref_id":20939},{"sent":"giraffe","ref_id":20940},{"sent":"giraffe on right","ref_id":20941},{"sent":"man on right","ref_id":20942},{"sent":"man in black","ref_id":20943},{"sent":"left person","ref_id":20944},{"sent":"kid in front","ref_id":20945},{"sent":"batter","ref_id":20946},{"sent":"catcher","ref_id":20947},{"sent":"umpire","ref_id":20948},{"sent":"catcher","ref_id":20954},{"sent":"top left corner","ref_id":20977},{"sent":"person in back","ref_id":20978},{"sent":"baby","ref_id":21019},{"sent":"baby","ref_id":21020},{"sent":"sheep in front","ref_id":21081},{"sent":"right kid","ref_id":21082},{"sent":"girl in pink","ref_id":21083},{"sent":"sheep in front","ref_id":21084},{"sent":"girl on left","ref_id":21122},{"sent":"man on right","ref_id":21123},{"sent":"man in black","ref_id":21124},{"sent":"woman in white","ref_id":21125},{"sent":"left guy","ref_id":21190},{"sent":"woman","ref_id":21191},{"sent":"man","ref_id":21292},{"sent":"white tie","ref_id":21293},{"sent":"UNK","ref_id":21294},{"sent":"left tie","ref_id":21295},{"sent":"left person","ref_id":21296},{"sent":"woman in white","ref_id":21302},{"sent":"man in white","ref_id":21303},{"sent":"girl on left","ref_id":21410},{"sent":"girl on right","ref_id":21411},{"sent":"number 18","ref_id":21422},{"sent":"man in blue shirt","ref_id":21423},{"sent":"number 2","ref_id":21424},{"sent":"left player","ref_id":21425},{"sent":"number 18","ref_id":21426},{"sent":"second from left","ref_id":21433},{"sent":"second board from right","ref_id":21434},{"sent":"right person","ref_id":21435},{"sent":"second from right","ref_id":21436},{"sent":"second from left","ref_id":21437},{"sent":"middle person","ref_id":21438},{"sent":"left person","ref_id":21439},{"sent":"man on left","ref_id":21440},{"sent":"right girl","ref_id":21444},{"sent":"man in white","ref_id":21525},{"sent":"person on right","ref_id":21580},{"sent":"person on left","ref_id":21581},{"sent":"man in red 
shirt","ref_id":21607},{"sent":"man on left","ref_id":21608},{"sent":"woman in front","ref_id":21609},{"sent":"pizza slice on left","ref_id":21616},{"sent":"pizza slice","ref_id":21617},{"sent":"hand on left","ref_id":21618},{"sent":"white shirt upper right","ref_id":21619},{"sent":"bottom left corner","ref_id":21798},{"sent":"woman","ref_id":21799},{"sent":"girl in blue","ref_id":22059},{"sent":"girl in pink","ref_id":22060},{"sent":"kid on right","ref_id":22061},{"sent":"girl in pink","ref_id":22062},{"sent":"girl in pink","ref_id":22063},{"sent":"girl in white","ref_id":22088},{"sent":"left guy","ref_id":22089},{"sent":"guy in blue","ref_id":22117},{"sent":"red shirt","ref_id":22118},{"sent":"woman","ref_id":22475},{"sent":"girl in yellow","ref_id":22476},{"sent":"right racket","ref_id":22477},{"sent":"man in blue shirt","ref_id":22504},{"sent":"woman on left","ref_id":22505},{"sent":"woman on right","ref_id":22659},{"sent":"woman on left","ref_id":22660},{"sent":"woman in black","ref_id":22715},{"sent":"woman in black","ref_id":22716},{"sent":"man in white","ref_id":22717},{"sent":"woman on right","ref_id":22718},{"sent":"person in white shirt","ref_id":22796},{"sent":"woman in white","ref_id":22797},{"sent":"left person","ref_id":22798},{"sent":"right guy","ref_id":22862},{"sent":"left player","ref_id":22863},{"sent":"woman","ref_id":23015},{"sent":"woman in black","ref_id":23016},{"sent":"person on left","ref_id":23077},{"sent":"man","ref_id":23078},{"sent":"red shirt","ref_id":23129},{"sent":"red shirt","ref_id":23130},{"sent":"girl in pink","ref_id":23131},{"sent":"woman on right","ref_id":23179},{"sent":"woman in black","ref_id":23180},{"sent":"man on left","ref_id":23192},{"sent":"man on right","ref_id":23193},{"sent":"man in white shirt","ref_id":23194},{"sent":"man on left","ref_id":23235},{"sent":"woman in white","ref_id":23236},{"sent":"woman in middle","ref_id":23237},{"sent":"man in middle","ref_id":23249},{"sent":"man on 
right","ref_id":23250},{"sent":"man on left","ref_id":23251},{"sent":"left person","ref_id":23254},{"sent":"man in white shirt","ref_id":23255},{"sent":"right person","ref_id":23256},{"sent":"catcher","ref_id":23358},{"sent":"umpire","ref_id":23359},{"sent":"man in white","ref_id":23410},{"sent":"woman in red","ref_id":23411},{"sent":"girl in pink","ref_id":23412},{"sent":"baby","ref_id":23564},{"sent":"baby","ref_id":23565},{"sent":"bottom right bowl","ref_id":23652},{"sent":"woman in white","ref_id":23653},{"sent":"woman in white","ref_id":23654},{"sent":"top right corner","ref_id":23857},{"sent":"man in white shirt","ref_id":23863},{"sent":"guy in blue","ref_id":23864},{"sent":"guy in blue shirt","ref_id":23865},{"sent":"girl in yellow","ref_id":23866},{"sent":"man on left","ref_id":23890},{"sent":"woman on right","ref_id":23891},{"sent":"girl","ref_id":23910},{"sent":"girl","ref_id":23911},{"sent":"woman in red","ref_id":23919},{"sent":"person in front","ref_id":23920},{"sent":"right guy","ref_id":24045},{"sent":"woman in white","ref_id":24046},{"sent":"woman in middle","ref_id":24058},{"sent":"man in middle","ref_id":24059},{"sent":"man in white shirt","ref_id":24060},{"sent":"man on right","ref_id":24061},{"sent":"man on left","ref_id":24062},{"sent":"woman in purple","ref_id":24063},{"sent":"person on right","ref_id":24064},{"sent":"woman in middle","ref_id":24237},{"sent":"baby","ref_id":24238},{"sent":"left guy","ref_id":24265},{"sent":"man on right","ref_id":24266},{"sent":"man in black","ref_id":24267},{"sent":"right player","ref_id":24297},{"sent":"left player","ref_id":24298},{"sent":"man on left","ref_id":24320},{"sent":"man in white","ref_id":24321},{"sent":"man on right","ref_id":24322},{"sent":"man on right","ref_id":24323},{"sent":"man in blue","ref_id":24369},{"sent":"left glass","ref_id":24370},{"sent":"man in red","ref_id":24371},{"sent":"left guy","ref_id":24381},{"sent":"person on right","ref_id":24382},{"sent":"man on 
right","ref_id":24396},{"sent":"man in black","ref_id":24397},{"sent":"girl on left","ref_id":24454},{"sent":"girl in white","ref_id":24455},{"sent":"man on right","ref_id":24456},{"sent":"man in white","ref_id":24491},{"sent":"person on left","ref_id":24492},{"sent":"woman on right","ref_id":24493},{"sent":"man in blue shirt","ref_id":24510},{"sent":"woman in back","ref_id":24511},{"sent":"woman","ref_id":24512},{"sent":"man in white","ref_id":24513},{"sent":"right person","ref_id":24525},{"sent":"left guy","ref_id":24526},{"sent":"boy in white","ref_id":24527},{"sent":"right hot dog","ref_id":24891},{"sent":"hot dog on left","ref_id":24892},{"sent":"hand on left","ref_id":24893},{"sent":"arm on right","ref_id":24894},{"sent":"man on left","ref_id":24895},{"sent":"man in red shirt","ref_id":24896},{"sent":"woman in white","ref_id":24897},{"sent":"left guy","ref_id":24938},{"sent":"girl in white","ref_id":24939},{"sent":"white shirt","ref_id":25002},{"sent":"man in black shirt on left","ref_id":25003},{"sent":"man in white shirt","ref_id":25004},{"sent":"person in white shirt","ref_id":25077},{"sent":"man","ref_id":25078},{"sent":"person on right","ref_id":25334},{"sent":"man","ref_id":25335},{"sent":"woman","ref_id":25359},{"sent":"man","ref_id":25360},{"sent":"top left black shirt","ref_id":25386},{"sent":"top right corner","ref_id":25387},{"sent":"batter","ref_id":25419},{"sent":"batter","ref_id":25420},{"sent":"right player","ref_id":25471},{"sent":"left player","ref_id":25472},{"sent":"person on left","ref_id":25546},{"sent":"guy in white","ref_id":25547},{"sent":"kid","ref_id":25548},{"sent":"woman","ref_id":25631},{"sent":"woman in white","ref_id":25632},{"sent":"man on right","ref_id":25753},{"sent":"boy in yellow","ref_id":25754},{"sent":"white 
shirt","ref_id":25800},{"sent":"woman","ref_id":25801},{"sent":"hand","ref_id":25802},{"sent":"umpire","ref_id":25818},{"sent":"batter","ref_id":25819},{"sent":"man","ref_id":26042},{"sent":"girl","ref_id":26043},{"sent":"man on right","ref_id":26086},{"sent":"man in black","ref_id":26087},{"sent":"man in black shirt","ref_id":26088},{"sent":"man on left","ref_id":26089},{"sent":"left most person","ref_id":26263},{"sent":"second board from right","ref_id":26264},{"sent":"man on left","ref_id":26265},{"sent":"guy in middle","ref_id":26266},{"sent":"person in middle","ref_id":26267},{"sent":"man","ref_id":26498},{"sent":"woman","ref_id":26499},{"sent":"guy on right","ref_id":26509},{"sent":"kid in white","ref_id":26510},{"sent":"person on right","ref_id":26571},{"sent":"person on right","ref_id":26572},{"sent":"batter","ref_id":26628},{"sent":"umpire","ref_id":26629},{"sent":"catcher","ref_id":26630},{"sent":"person on left","ref_id":26631},{"sent":"man in white shirt","ref_id":26632},{"sent":"woman in black dress","ref_id":26633},{"sent":"woman in black","ref_id":26634},{"sent":"person in white shirt","ref_id":26684},{"sent":"man in white","ref_id":26685},{"sent":"man in middle","ref_id":26686},{"sent":"woman in front","ref_id":26698},{"sent":"woman","ref_id":26699},{"sent":"man in front","ref_id":26744},{"sent":"right person","ref_id":26745},{"sent":"table on right","ref_id":26749},{"sent":"right kid","ref_id":26750},{"sent":"boy in white shirt","ref_id":26751},{"sent":"left table","ref_id":26752},{"sent":"person on right","ref_id":26856},{"sent":"left person","ref_id":26857},{"sent":"bottom left head","ref_id":26858},{"sent":"guy in white","ref_id":26877},{"sent":"guy in black","ref_id":26878},{"sent":"man in front","ref_id":26977},{"sent":"person on left","ref_id":26978},{"sent":"kid in red","ref_id":27212},{"sent":"kid in red","ref_id":27213},{"sent":"left person","ref_id":27366},{"sent":"bottom left corner","ref_id":27367},{"sent":"bottom 
sandwich","ref_id":27368},{"sent":"boy in blue","ref_id":27369},{"sent":"man in blue","ref_id":27370},{"sent":"person on right","ref_id":27447},{"sent":"person on left","ref_id":27448},{"sent":"man on left with hat","ref_id":27489},{"sent":"horse in front","ref_id":27490},{"sent":"horse on right","ref_id":27491},{"sent":"man in front with blue hat","ref_id":27492},{"sent":"horse on left","ref_id":27493},{"sent":"man in blue shirt","ref_id":27550},{"sent":"man in white","ref_id":27551},{"sent":"person on right","ref_id":27643},{"sent":"person in middle","ref_id":27644},{"sent":"chef on right","ref_id":27684},{"sent":"left chef","ref_id":27685},{"sent":"bottom right phone","ref_id":27717},{"sent":"bottom left hand","ref_id":27718},{"sent":"man on right","ref_id":27728},{"sent":"man in white shirt","ref_id":27729},{"sent":"man","ref_id":27775},{"sent":"bride","ref_id":27776},{"sent":"man in white shirt","ref_id":27946},{"sent":"woman in white","ref_id":27947},{"sent":"blue shirt","ref_id":27948},{"sent":"woman in white","ref_id":27949},{"sent":"bottom right corner","ref_id":27950},{"sent":"man on right","ref_id":27957},{"sent":"man","ref_id":27958},{"sent":"person in front","ref_id":28059},{"sent":"person in middle","ref_id":28060},{"sent":"the umbrella","ref_id":28061},{"sent":"man in white","ref_id":28068},{"sent":"man in blue","ref_id":28069},{"sent":"man in white shirt","ref_id":28070},{"sent":"person on left","ref_id":28146},{"sent":"man in white","ref_id":28147},{"sent":"person in blue","ref_id":28386},{"sent":"person in front","ref_id":28387},{"sent":"woman","ref_id":28388},{"sent":"man on right","ref_id":28389},{"sent":"woman","ref_id":28390},{"sent":"catcher","ref_id":28414},{"sent":"batter","ref_id":28415},{"sent":"baby","ref_id":28479},{"sent":"baby","ref_id":28480},{"sent":"left guy","ref_id":28761},{"sent":"right person","ref_id":28762},{"sent":"right skier","ref_id":28788},{"sent":"middle person","ref_id":28789},{"sent":"blue 
shirt","ref_id":28790},{"sent":"girl in pink","ref_id":28791},{"sent":"baby","ref_id":28792},{"sent":"arm on left","ref_id":28825},{"sent":"arm on right","ref_id":28826},{"sent":"man in white shirt","ref_id":28827},{"sent":"woman in black","ref_id":28874},{"sent":"boy in blue shirt","ref_id":28875},{"sent":"man on left","ref_id":29002},{"sent":"right guy","ref_id":29003},{"sent":"person on left","ref_id":29017},{"sent":"person on right","ref_id":29018},{"sent":"man in white shirt","ref_id":29100},{"sent":"guy in red shirt","ref_id":29101},{"sent":"player in front","ref_id":29102},{"sent":"girl","ref_id":29236},{"sent":"left girl","ref_id":29237},{"sent":"person on right","ref_id":29341},{"sent":"tennis player","ref_id":29342},{"sent":"guy on bike","ref_id":29390},{"sent":"left bike","ref_id":29391},{"sent":"bike on right","ref_id":29392},{"sent":"bike","ref_id":29393},{"sent":"left","ref_id":29417},{"sent":"woman","ref_id":29418},{"sent":"batter","ref_id":29448},{"sent":"batter","ref_id":29449},{"sent":"man in blue shirt","ref_id":29536},{"sent":"man in white shirt","ref_id":29537},{"sent":"man in white","ref_id":29538},{"sent":"right person","ref_id":29560},{"sent":"tennis player","ref_id":29561},{"sent":"left person","ref_id":29637},{"sent":"right person","ref_id":29638},{"sent":"woman in front","ref_id":29639},{"sent":"woman in black","ref_id":29640},{"sent":"person on right","ref_id":29677},{"sent":"left bed","ref_id":29678},{"sent":"woman on left","ref_id":29811},{"sent":"man","ref_id":29812},{"sent":"person in front","ref_id":29882},{"sent":"woman","ref_id":29883},{"sent":"girl","ref_id":29908},{"sent":"cake in front of cake","ref_id":29909},{"sent":"baby","ref_id":29910},{"sent":"man in black shirt","ref_id":29956},{"sent":"kid in white","ref_id":29957},{"sent":"man on left","ref_id":30096},{"sent":"guy in back","ref_id":30097},{"sent":"man in white shirt","ref_id":30098},{"sent":"man","ref_id":30099},{"sent":"woman in red","ref_id":30357},{"sent":"person on 
right","ref_id":30358},{"sent":"man","ref_id":30469},{"sent":"kid","ref_id":30470},{"sent":"woman","ref_id":30495},{"sent":"man","ref_id":30496},{"sent":"left person","ref_id":30516},{"sent":"woman","ref_id":30517},{"sent":"right person","ref_id":30525},{"sent":"girl","ref_id":30526},{"sent":"right girl","ref_id":30556},{"sent":"second from left","ref_id":30557},{"sent":"girl on right","ref_id":30558},{"sent":"girl in white","ref_id":30559},{"sent":"girl in middle","ref_id":30560},{"sent":"girl on left","ref_id":30561},{"sent":"right girl","ref_id":30562},{"sent":"second from left","ref_id":30563},{"sent":"man on right","ref_id":30637},{"sent":"right laptop","ref_id":30638},{"sent":"bottom left laptop","ref_id":30639},{"sent":"woman on left","ref_id":30640},{"sent":"man","ref_id":30676},{"sent":"man on right","ref_id":30677},{"sent":"left person","ref_id":30731},{"sent":"person in black","ref_id":30732},{"sent":"left guy","ref_id":30769},{"sent":"right guy","ref_id":30770},{"sent":"arm on left","ref_id":30803},{"sent":"girl in red","ref_id":30804},{"sent":"man on left","ref_id":30805},{"sent":"hand holding scissors","ref_id":30888},{"sent":"hand","ref_id":30889},{"sent":"left girl","ref_id":30959},{"sent":"baby","ref_id":30960},{"sent":"left bench","ref_id":31198},{"sent":"right bike","ref_id":31199},{"sent":"man in white shirt","ref_id":31201},{"sent":"person on right","ref_id":31202},{"sent":"woman","ref_id":31203},{"sent":"woman on right","ref_id":31206},{"sent":"woman","ref_id":31207},{"sent":"batter","ref_id":31356},{"sent":"catcher","ref_id":31357},{"sent":"batter","ref_id":31358},{"sent":"left edge of pic","ref_id":31459},{"sent":"arm on right","ref_id":31460},{"sent":"man on right","ref_id":31461},{"sent":"man","ref_id":31462},{"sent":"catcher","ref_id":31505},{"sent":"batter","ref_id":31506},{"sent":"girl on right","ref_id":31547},{"sent":"woman","ref_id":31548},{"sent":"chair on 
left","ref_id":31552},{"sent":"woman","ref_id":31553},{"sent":"girl","ref_id":31554},{"sent":"girl","ref_id":31555},{"sent":"left guy","ref_id":31572},{"sent":"right girl","ref_id":31573},{"sent":"right woman","ref_id":31597},{"sent":"man on right","ref_id":31598},{"sent":"woman in middle","ref_id":31599},{"sent":"woman in middle","ref_id":31600},{"sent":"man on left","ref_id":31601},{"sent":"player in white","ref_id":31762},{"sent":"red shirt right","ref_id":31763},{"sent":"blue shirt","ref_id":31764},{"sent":"guy in white shirt","ref_id":31765},{"sent":"person in black","ref_id":31799},{"sent":"person on right","ref_id":31800},{"sent":"man on right","ref_id":31816},{"sent":"man in suit","ref_id":31817},{"sent":"man on left","ref_id":31818},{"sent":"person on left","ref_id":31862},{"sent":"girl","ref_id":31863},{"sent":"person in black on right","ref_id":31866},{"sent":"yellow shirt","ref_id":31867},{"sent":"man","ref_id":31955},{"sent":"girl","ref_id":31956},{"sent":"left girl","ref_id":32164},{"sent":"boy on right","ref_id":32217},{"sent":"boy in blue","ref_id":32218},{"sent":"man on right","ref_id":32234},{"sent":"man","ref_id":32235},{"sent":"baby","ref_id":32298},{"sent":"man on left","ref_id":32299},{"sent":"black suitcase","ref_id":32432},{"sent":"black bag","ref_id":32433},{"sent":"catcher","ref_id":32508},{"sent":"batter","ref_id":32509},{"sent":"woman on left","ref_id":32582},{"sent":"girl in white","ref_id":32583},{"sent":"batter","ref_id":32584},{"sent":"umpire","ref_id":32585},{"sent":"woman","ref_id":32644},{"sent":"girl","ref_id":32645},{"sent":"right sheep","ref_id":32646},{"sent":"left sheep","ref_id":32647},{"sent":"right guy","ref_id":32804},{"sent":"man in middle","ref_id":32805},{"sent":"man on left","ref_id":32806},{"sent":"man in middle","ref_id":32807},{"sent":"man in white","ref_id":32850},{"sent":"guy in white shirt","ref_id":32851},{"sent":"man in white shirt","ref_id":33044},{"sent":"man in blue shirt","ref_id":33045},{"sent":"woman in 
white","ref_id":33046},{"sent":"man on right","ref_id":33056},{"sent":"man in middle","ref_id":33057},{"sent":"man","ref_id":33058},{"sent":"man","ref_id":33097},{"sent":"man on right","ref_id":33098},{"sent":"bottom right corner","ref_id":33327},{"sent":"man in black shirt","ref_id":33328},{"sent":"catcher","ref_id":33462},{"sent":"batter","ref_id":33463},{"sent":"man on right","ref_id":33599},{"sent":"man","ref_id":33600},{"sent":"right person","ref_id":33622},{"sent":"man on right","ref_id":33623},{"sent":"girl in middle","ref_id":33624},{"sent":"girl on left","ref_id":33625},{"sent":"guy on right","ref_id":33631},{"sent":"left person","ref_id":33632},{"sent":"catcher","ref_id":33633},{"sent":"batter","ref_id":33634},{"sent":"donut in middle","ref_id":33696},{"sent":"hand","ref_id":33697},{"sent":"white shirt","ref_id":33698},{"sent":"man in black shirt","ref_id":33819},{"sent":"woman on right","ref_id":33820},{"sent":"woman in black","ref_id":33821},{"sent":"man in black","ref_id":33822},{"sent":"blue suitcase","ref_id":33922},{"sent":"man in black shirt","ref_id":33923},{"sent":"the seat behind the man","ref_id":33924},{"sent":"woman in black","ref_id":33925},{"sent":"woman in front","ref_id":33926},{"sent":"girl","ref_id":33990},{"sent":"right flower","ref_id":33991},{"sent":"right bottom corner","ref_id":34093},{"sent":"woman in pink","ref_id":34094},{"sent":"woman in black","ref_id":34095},{"sent":"man on left","ref_id":34221},{"sent":"man in white","ref_id":34222},{"sent":"woman in white","ref_id":34389},{"sent":"man in white shirt","ref_id":34390},{"sent":"left girl","ref_id":34391},{"sent":"girl in middle","ref_id":34443},{"sent":"girl in white","ref_id":34444},{"sent":"woman on right","ref_id":34445},{"sent":"woman on left","ref_id":34463},{"sent":"white car","ref_id":34464},{"sent":"man in white","ref_id":34478},{"sent":"man on left","ref_id":34479},{"sent":"man in blue","ref_id":34480},{"sent":"man on right","ref_id":34481},{"sent":"right 
guy","ref_id":34655},{"sent":"woman","ref_id":34656},{"sent":"person on left","ref_id":34659},{"sent":"girl in front","ref_id":34660},{"sent":"girl on right","ref_id":34661},{"sent":"person in white shirt","ref_id":34708},{"sent":"person in white shirt","ref_id":34709},{"sent":"person in white shirt","ref_id":34710},{"sent":"person in white shirt","ref_id":34711},{"sent":"woman in pink","ref_id":34712},{"sent":"woman in front","ref_id":34713},{"sent":"second from right","ref_id":34716},{"sent":"man on right","ref_id":34717},{"sent":"right person","ref_id":34718},{"sent":"left person","ref_id":34719},{"sent":"right woman","ref_id":34743},{"sent":"woman","ref_id":34744},{"sent":"man on right","ref_id":34745},{"sent":"batter","ref_id":34879},{"sent":"catcher","ref_id":34880},{"sent":"left kid","ref_id":35066},{"sent":"girl on right","ref_id":35067},{"sent":"kid on right","ref_id":35100},{"sent":"kid","ref_id":35101},{"sent":"blue","ref_id":35172},{"sent":"man in front with white shirt","ref_id":35206},{"sent":"second from left","ref_id":35207},{"sent":"green shirt","ref_id":35208},{"sent":"woman in green","ref_id":35209},{"sent":"girl","ref_id":35268},{"sent":"woman","ref_id":35269},{"sent":"bride","ref_id":35305},{"sent":"groom","ref_id":35306},{"sent":"man","ref_id":35319},{"sent":"woman in red","ref_id":35320},{"sent":"man in white","ref_id":35331},{"sent":"man in white shirt","ref_id":35332},{"sent":"guy on right","ref_id":35333},{"sent":"left person","ref_id":35407},{"sent":"woman on right","ref_id":35408},{"sent":"woman","ref_id":35409},{"sent":"umbrella on right","ref_id":35410},{"sent":"top left corner","ref_id":35411},{"sent":"girl in blue","ref_id":35615},{"sent":"woman in white","ref_id":35616},{"sent":"left dog","ref_id":35654},{"sent":"bottom left corner","ref_id":35655},{"sent":"left person","ref_id":35656},{"sent":"right dog","ref_id":35657},{"sent":"man in white shirt","ref_id":35710},{"sent":"woman in 
white","ref_id":35711},{"sent":"girl","ref_id":35739},{"sent":"guy on left","ref_id":35740},{"sent":"girl","ref_id":35786},{"sent":"girl","ref_id":35787},{"sent":"left guy","ref_id":35794},{"sent":"right girl","ref_id":35795},{"sent":"woman in black","ref_id":35837},{"sent":"left guy","ref_id":35848},{"sent":"right person","ref_id":35849},{"sent":"boy in middle","ref_id":35850},{"sent":"right girl","ref_id":35851},{"sent":"woman in red","ref_id":35965},{"sent":"woman in red","ref_id":35966},{"sent":"man in front","ref_id":35970},{"sent":"player in white","ref_id":35975},{"sent":"guy in blue","ref_id":35976},{"sent":"woman","ref_id":36051},{"sent":"baby","ref_id":36052},{"sent":"batter","ref_id":36160},{"sent":"catcher","ref_id":36161},{"sent":"umpire","ref_id":36162},{"sent":"kid","ref_id":36426},{"sent":"baby","ref_id":36427},{"sent":"woman","ref_id":36545},{"sent":"woman","ref_id":36546},{"sent":"woman on left","ref_id":36691},{"sent":"man on right","ref_id":36692},{"sent":"woman in black","ref_id":36693},{"sent":"man in blue shirt","ref_id":36694},{"sent":"woman on right","ref_id":36779},{"sent":"man","ref_id":36780},{"sent":"woman in back","ref_id":36880},{"sent":"woman in white","ref_id":36881},{"sent":"man on right","ref_id":36911},{"sent":"left person","ref_id":36912},{"sent":"man on left","ref_id":36913},{"sent":"left person","ref_id":36928},{"sent":"girl","ref_id":36929},{"sent":"girl","ref_id":36930},{"sent":"girl","ref_id":36931},{"sent":"batter","ref_id":36993},{"sent":"batter","ref_id":36994},{"sent":"right person","ref_id":36999},{"sent":"left guy","ref_id":37000},{"sent":"umpire","ref_id":37032},{"sent":"batter","ref_id":37033},{"sent":"woman","ref_id":37066},{"sent":"man","ref_id":37067},{"sent":"man on right","ref_id":37125},{"sent":"man","ref_id":37126},{"sent":"red shirt","ref_id":37250},{"sent":"girl","ref_id":37251},{"sent":"girl in blue","ref_id":37286},{"sent":"right guy","ref_id":37287},{"sent":"man on left","ref_id":37288},{"sent":"person 
in background","ref_id":37431},{"sent":"man","ref_id":37432},{"sent":"left hand","ref_id":37472},{"sent":"hand on right","ref_id":37473},{"sent":"man on right","ref_id":37478},{"sent":"woman","ref_id":37479},{"sent":"man in black shirt","ref_id":37598},{"sent":"man in white shirt","ref_id":37599},{"sent":"man in black shirt","ref_id":37600},{"sent":"kid","ref_id":37756},{"sent":"girl","ref_id":37757},{"sent":"baby","ref_id":37800},{"sent":"woman","ref_id":37801},{"sent":"woman on left","ref_id":37815},{"sent":"man in white shirt","ref_id":37816},{"sent":"man in white shirt","ref_id":37883},{"sent":"white shirt","ref_id":37884},{"sent":"man on right","ref_id":37974},{"sent":"woman","ref_id":37975},{"sent":"left girl","ref_id":38214},{"sent":"man on right","ref_id":38215},{"sent":"right UNK","ref_id":38216},{"sent":"man in black shirt","ref_id":38227},{"sent":"woman in white","ref_id":38228},{"sent":"left guy","ref_id":38274},{"sent":"man in white","ref_id":38275},{"sent":"person on right","ref_id":38276},{"sent":"man in white shirt","ref_id":38340},{"sent":"man in white shirt","ref_id":38341},{"sent":"man on right","ref_id":38390},{"sent":"left woman","ref_id":38391},{"sent":"right guy","ref_id":38392},{"sent":"right guy","ref_id":38417},{"sent":"guy in white","ref_id":38418},{"sent":"guy in white","ref_id":38419},{"sent":"man on right","ref_id":38446},{"sent":"second from right","ref_id":38447},{"sent":"man in middle","ref_id":38448},{"sent":"second from left","ref_id":38449},{"sent":"man on left","ref_id":38504},{"sent":"man on right","ref_id":38505},{"sent":"woman","ref_id":38506},{"sent":"right girl","ref_id":38544},{"sent":"woman on left","ref_id":38545},{"sent":"chair on left","ref_id":38546},{"sent":"chair on right","ref_id":38547},{"sent":"right person","ref_id":38587},{"sent":"man in middle","ref_id":38588},{"sent":"person in middle","ref_id":38589},{"sent":"man in white shirt","ref_id":38650},{"sent":"woman in black","ref_id":38651},{"sent":"left 
guy","ref_id":38654},{"sent":"man in red shirt","ref_id":38655},{"sent":"woman in black","ref_id":38656},{"sent":"red shirt","ref_id":38742},{"sent":"man in white shirt","ref_id":38743},{"sent":"woman in black","ref_id":38744},{"sent":"man","ref_id":38815},{"sent":"man","ref_id":38816},{"sent":"person on bike","ref_id":38856},{"sent":"person in red","ref_id":38857},{"sent":"left person","ref_id":38938},{"sent":"woman in front","ref_id":38939},{"sent":"white couch","ref_id":39139},{"sent":"woman in red","ref_id":39140},{"sent":"man in white","ref_id":39141},{"sent":"man in white shirt","ref_id":39142},{"sent":"man on right","ref_id":39180},{"sent":"man on right","ref_id":39181},{"sent":"man in white","ref_id":39182},{"sent":"man in white","ref_id":39183},{"sent":"man on right","ref_id":39298},{"sent":"man on left","ref_id":39299},{"sent":"left person","ref_id":39406},{"sent":"man","ref_id":39407},{"sent":"black shirt","ref_id":39550},{"sent":"laptop on right","ref_id":39551},{"sent":"black laptop","ref_id":39552},{"sent":"blue shirt","ref_id":39553},{"sent":"left laptop","ref_id":39554},{"sent":"left hand","ref_id":39555},{"sent":"guy in white","ref_id":39593},{"sent":"guy in red shirt","ref_id":39594},{"sent":"guy in white shirt","ref_id":39595},{"sent":"white shirt","ref_id":39596},{"sent":"kid in red","ref_id":39597},{"sent":"guy in middle","ref_id":39598},{"sent":"kid in blue","ref_id":39599},{"sent":"girl on right","ref_id":39600},{"sent":"man on left","ref_id":39635},{"sent":"man","ref_id":39636},{"sent":"person on right","ref_id":39644},{"sent":"woman","ref_id":39645},{"sent":"white car on right","ref_id":39646},{"sent":"baby","ref_id":39755},{"sent":"baby","ref_id":39756},{"sent":"man in black","ref_id":39839},{"sent":"kid in white","ref_id":39875},{"sent":"woman","ref_id":39876},{"sent":"baby","ref_id":39877},{"sent":"right person","ref_id":39909},{"sent":"person on right","ref_id":39910},{"sent":"woman in front","ref_id":39929},{"sent":"woman on 
right","ref_id":39930},{"sent":"woman in pink","ref_id":39931},{"sent":"girl in back","ref_id":40011},{"sent":"boy","ref_id":40012},{"sent":"person in black shirt on right","ref_id":40013},{"sent":"player on right","ref_id":40122},{"sent":"catcher","ref_id":40123},{"sent":"man in white","ref_id":40174},{"sent":"woman on left","ref_id":40175},{"sent":"woman in black","ref_id":40313},{"sent":"person on right","ref_id":40314},{"sent":"man on right","ref_id":40315},{"sent":"woman on left","ref_id":40316},{"sent":"woman in black","ref_id":40348},{"sent":"woman on left","ref_id":40349},{"sent":"man in white shirt","ref_id":40374},{"sent":"white hat","ref_id":40375},{"sent":"woman on left","ref_id":40376},{"sent":"player in white","ref_id":40390},{"sent":"man in white","ref_id":40391},{"sent":"right hot dog","ref_id":40496},{"sent":"woman in back","ref_id":40497},{"sent":"right pizza","ref_id":40498},{"sent":"girl in pink","ref_id":40499},{"sent":"man in white shirt","ref_id":40567},{"sent":"left person","ref_id":40568},{"sent":"girl on left","ref_id":40601},{"sent":"person in front","ref_id":40602},{"sent":"woman on right","ref_id":40603},{"sent":"umpire","ref_id":40633},{"sent":"batter","ref_id":40634},{"sent":"batter","ref_id":40635},{"sent":"car on right","ref_id":40695},{"sent":"right guy","ref_id":40696},{"sent":"kid in blue","ref_id":40725},{"sent":"catcher","ref_id":40726},{"sent":"man in middle","ref_id":40808},{"sent":"woman in black","ref_id":40809},{"sent":"woman on left","ref_id":40856},{"sent":"woman on right","ref_id":40857},{"sent":"woman","ref_id":40864},{"sent":"woman in white","ref_id":40865},{"sent":"woman","ref_id":40866},{"sent":"black area above the UNK","ref_id":40878},{"sent":"the man in the middle","ref_id":40879},{"sent":"man on right","ref_id":40880},{"sent":"white shirt","ref_id":40881},{"sent":"man in white","ref_id":41067},{"sent":"man in 
front","ref_id":41068},{"sent":"girl","ref_id":41212},{"sent":"banana","ref_id":41213},{"sent":"woman","ref_id":41214},{"sent":"woman","ref_id":41215},{"sent":"man on left","ref_id":41296},{"sent":"man","ref_id":41297},{"sent":"man","ref_id":41457},{"sent":"woman","ref_id":41458},{"sent":"woman","ref_id":41478},{"sent":"girl","ref_id":41479},{"sent":"man in red hat","ref_id":41679},{"sent":"man in white shirt","ref_id":41680},{"sent":"woman in blue shirt","ref_id":41681},{"sent":"left elephant","ref_id":41705},{"sent":"elephant on right","ref_id":41706},{"sent":"elephant in back","ref_id":41707},{"sent":"baby","ref_id":41708},{"sent":"baby","ref_id":41709},{"sent":"right person","ref_id":41812},{"sent":"woman in white","ref_id":41889},{"sent":"woman in white","ref_id":41890},{"sent":"person on left","ref_id":41899},{"sent":"man in white","ref_id":41900},{"sent":"woman on right","ref_id":42074},{"sent":"bottle on left","ref_id":42075},{"sent":"right bottle","ref_id":42076},{"sent":"bottle on left","ref_id":42077},{"sent":"man in blue","ref_id":42078},{"sent":"man on left","ref_id":42079},{"sent":"kid in red","ref_id":42080},{"sent":"person in white","ref_id":42189},{"sent":"kid","ref_id":42190},{"sent":"man in white","ref_id":42208},{"sent":"man in white shirt","ref_id":42209},{"sent":"guy in white shirt","ref_id":42257},{"sent":"guy in white shirt","ref_id":42258},{"sent":"guy on right","ref_id":42369},{"sent":"man in black shirt","ref_id":42370},{"sent":"woman","ref_id":42635},{"sent":"man","ref_id":42636},{"sent":"right guy","ref_id":42678},{"sent":"woman","ref_id":42679},{"sent":"right woman","ref_id":42896},{"sent":"person in black under umbrella","ref_id":42897},{"sent":"man on right","ref_id":43003},{"sent":"bottom left table","ref_id":43004},{"sent":"man in white","ref_id":43005},{"sent":"girl","ref_id":43088},{"sent":"woman","ref_id":43089},{"sent":"left guy","ref_id":43150},{"sent":"left person","ref_id":43151},{"sent":"right 
guy","ref_id":43152},{"sent":"person on right","ref_id":43153},{"sent":"right guy","ref_id":43162},{"sent":"bike on right","ref_id":43175},{"sent":"right blue","ref_id":43176},{"sent":"man on bike","ref_id":43177},{"sent":"front guy","ref_id":43178},{"sent":"bike on left","ref_id":43179},{"sent":"second bike from left","ref_id":43180},{"sent":"bike on right","ref_id":43181},{"sent":"baby","ref_id":43261},{"sent":"left person","ref_id":43262},{"sent":"baby","ref_id":43263},{"sent":"baby","ref_id":43264},{"sent":"table in front","ref_id":43288},{"sent":"woman in middle","ref_id":43289},{"sent":"girl on right","ref_id":43290},{"sent":"woman on left","ref_id":43291},{"sent":"middle chair","ref_id":43292},{"sent":"girl on right","ref_id":43298},{"sent":"woman","ref_id":43299},{"sent":"man in black","ref_id":43300},{"sent":"person in black","ref_id":43311},{"sent":"person in black","ref_id":43312},{"sent":"left girl","ref_id":43313},{"sent":"right girl","ref_id":43314},{"sent":"top left corner","ref_id":43315},{"sent":"person on left","ref_id":43316},{"sent":"right glass","ref_id":43317},{"sent":"glass on right","ref_id":43318},{"sent":"person in back","ref_id":43319},{"sent":"right glass","ref_id":43320},{"sent":"blue car","ref_id":43341},{"sent":"red shirt","ref_id":43342},{"sent":"person on right","ref_id":43343},{"sent":"white car","ref_id":43344},{"sent":"woman","ref_id":43379},{"sent":"man","ref_id":43380},{"sent":"man in white","ref_id":43535},{"sent":"person on left","ref_id":43536},{"sent":"woman in black","ref_id":43537},{"sent":"person on left","ref_id":43538},{"sent":"man in front","ref_id":43539},{"sent":"left bear","ref_id":43647},{"sent":"woman","ref_id":43648},{"sent":"man in blue","ref_id":43944},{"sent":"man in middle","ref_id":43945},{"sent":"person on right","ref_id":43948},{"sent":"woman","ref_id":43949},{"sent":"red shirt","ref_id":43996},{"sent":"guy in blue","ref_id":44027},{"sent":"tennis player","ref_id":44028},{"sent":"right 
guy","ref_id":44050},{"sent":"man in black","ref_id":44051},{"sent":"kid in red","ref_id":44052},{"sent":"kid in red","ref_id":44357},{"sent":"man in white shirt","ref_id":44358},{"sent":"person in background","ref_id":44417},{"sent":"woman","ref_id":44418},{"sent":"woman","ref_id":44448},{"sent":"man","ref_id":44449},{"sent":"man on left","ref_id":44517},{"sent":"girl","ref_id":44518},{"sent":"woman","ref_id":44519},{"sent":"woman in purple","ref_id":44553},{"sent":"right person","ref_id":44554},{"sent":"red shirt","ref_id":44582},{"sent":"girl in blue shirt","ref_id":44583},{"sent":"guy in black shirt","ref_id":44584},{"sent":"woman in white","ref_id":44628},{"sent":"man on left","ref_id":44629},{"sent":"woman in white","ref_id":44630},{"sent":"girl in middle","ref_id":44633},{"sent":"man on left","ref_id":44634},{"sent":"woman in back","ref_id":44635},{"sent":"left person","ref_id":44642},{"sent":"woman","ref_id":44643},{"sent":"guy in red","ref_id":44644},{"sent":"person on left","ref_id":44645},{"sent":"guy in white","ref_id":44699},{"sent":"right hand","ref_id":44700},{"sent":"man on right","ref_id":44714},{"sent":"woman on left","ref_id":44715},{"sent":"man in black","ref_id":44732},{"sent":"man in red","ref_id":44733},{"sent":"the woman","ref_id":44740},{"sent":"hand","ref_id":44741},{"sent":"man","ref_id":44766},{"sent":"batter","ref_id":45019},{"sent":"catcher","ref_id":45020},{"sent":"red shirt","ref_id":45021},{"sent":"girl in white shirt","ref_id":45046},{"sent":"person on right","ref_id":45047},{"sent":"woman in white","ref_id":45048},{"sent":"left girl","ref_id":45049},{"sent":"guy on left","ref_id":45050},{"sent":"girl in white","ref_id":45051},{"sent":"girl in white","ref_id":45052},{"sent":"catcher","ref_id":45300},{"sent":"batter","ref_id":45301},{"sent":"guy on right","ref_id":45340},{"sent":"woman on left","ref_id":45341},{"sent":"woman in black dress","ref_id":45342},{"sent":"person on right","ref_id":45358},{"sent":"person in 
middle","ref_id":45359},{"sent":"man on right","ref_id":45367},{"sent":"woman in black","ref_id":45368},{"sent":"woman on right","ref_id":45369},{"sent":"person on left","ref_id":45407},{"sent":"woman in white","ref_id":45434},{"sent":"man in red","ref_id":45435},{"sent":"woman in black","ref_id":45436},{"sent":"guy on bike","ref_id":45600},{"sent":"man","ref_id":45601},{"sent":"man on left","ref_id":45675},{"sent":"man on left","ref_id":45676},{"sent":"man in black","ref_id":45677},{"sent":"woman on left","ref_id":45837},{"sent":"man on left","ref_id":45838},{"sent":"girl on right","ref_id":45839},{"sent":"white shirt","ref_id":45840},{"sent":"woman in red","ref_id":45841},{"sent":"right guy","ref_id":45863},{"sent":"left guy","ref_id":45864},{"sent":"guy in black","ref_id":45865},{"sent":"woman","ref_id":45966},{"sent":"woman","ref_id":45967},{"sent":"catcher","ref_id":46080},{"sent":"batter","ref_id":46081},{"sent":"umpire","ref_id":46165},{"sent":"catcher","ref_id":46166},{"sent":"woman in front","ref_id":46208},{"sent":"woman in front","ref_id":46209},{"sent":"person under umbrella","ref_id":46210},{"sent":"person on left","ref_id":46211},{"sent":"man in white","ref_id":46285},{"sent":"man in red hat","ref_id":46286},{"sent":"man in blue shirt","ref_id":46321},{"sent":"man on right","ref_id":46322},{"sent":"girl in white shirt on right","ref_id":46350},{"sent":"woman in white on right","ref_id":46351},{"sent":"woman in front with hat","ref_id":46352},{"sent":"woman in red","ref_id":46353},{"sent":"woman","ref_id":46393},{"sent":"right arm","ref_id":46394},{"sent":"top right corner","ref_id":46403},{"sent":"person in back","ref_id":46404},{"sent":"kid","ref_id":46405},{"sent":"person in background","ref_id":46451},{"sent":"person on right","ref_id":46452},{"sent":"woman","ref_id":46453},{"sent":"girl on right","ref_id":46555},{"sent":"arm on left","ref_id":46556},{"sent":"man in black shirt","ref_id":46581},{"sent":"man","ref_id":46582},{"sent":"person on 
left","ref_id":46672},{"sent":"man in black","ref_id":46673},{"sent":"man in middle with glasses","ref_id":46678},{"sent":"woman in white","ref_id":46679},{"sent":"bottom right corner","ref_id":46680},{"sent":"woman on right with black hair","ref_id":46681},{"sent":"woman in front with black hair","ref_id":46682},{"sent":"man in front with glasses","ref_id":46683},{"sent":"man on left","ref_id":46823},{"sent":"woman","ref_id":46824},{"sent":"girl in blue shirt","ref_id":46834},{"sent":"girl in black","ref_id":46835},{"sent":"person on right","ref_id":46836},{"sent":"girl in white shirt","ref_id":46837},{"sent":"groom","ref_id":46838},{"sent":"bride","ref_id":46839},{"sent":"right guy","ref_id":46880},{"sent":"man in white","ref_id":46881},{"sent":"woman on right","ref_id":46938},{"sent":"man on left","ref_id":46939},{"sent":"bottom right bowl","ref_id":46940},{"sent":"glass on left","ref_id":46941},{"sent":"left kid","ref_id":46949},{"sent":"man on right","ref_id":46950},{"sent":"woman in black","ref_id":46951},{"sent":"right person","ref_id":47014},{"sent":"left person","ref_id":47015},{"sent":"man on right","ref_id":47077},{"sent":"man on right","ref_id":47092},{"sent":"man in white shirt","ref_id":47093},{"sent":"man in black","ref_id":47094},{"sent":"guy in white shirt","ref_id":47164},{"sent":"man in white","ref_id":47165},{"sent":"woman in black","ref_id":47319},{"sent":"man in white","ref_id":47391},{"sent":"bottom right corner","ref_id":47392},{"sent":"bottom left person","ref_id":47393},{"sent":"man in black","ref_id":47394},{"sent":"man in middle","ref_id":47441},{"sent":"man in black","ref_id":47442},{"sent":"right guy","ref_id":47443},{"sent":"right guy","ref_id":47446},{"sent":"guy in black","ref_id":47447},{"sent":"person in front","ref_id":47519},{"sent":"woman in front","ref_id":47520},{"sent":"person on right","ref_id":47521},{"sent":"person on left","ref_id":47522},{"sent":"white car","ref_id":47566},{"sent":"woman","ref_id":47567},{"sent":"white 
shirt","ref_id":47568},{"sent":"right bus","ref_id":47569},{"sent":"woman in black","ref_id":47682},{"sent":"man in suit on right","ref_id":47683},{"sent":"woman in black","ref_id":47684},{"sent":"white shirt left","ref_id":47685},{"sent":"woman in black on left","ref_id":47686},{"sent":"man in black suit","ref_id":47687},{"sent":"man on right","ref_id":47757},{"sent":"woman","ref_id":47758},{"sent":"person on right","ref_id":47777},{"sent":"person on left","ref_id":47778},{"sent":"woman in white","ref_id":47779},{"sent":"woman in red","ref_id":47780},{"sent":"right player","ref_id":47885},{"sent":"left girl","ref_id":47886},{"sent":"right girl","ref_id":47887},{"sent":"left player","ref_id":47888},{"sent":"person in middle","ref_id":47999},{"sent":"person in front","ref_id":48000},{"sent":"right person","ref_id":48001},{"sent":"person in background on left","ref_id":48006},{"sent":"batter","ref_id":48007},{"sent":"man in black","ref_id":48132},{"sent":"woman in blue","ref_id":48133},{"sent":"pizza on right","ref_id":48134},{"sent":"pizza","ref_id":48135},{"sent":"girl","ref_id":48147},{"sent":"girl on right","ref_id":48148},{"sent":"catcher","ref_id":48151},{"sent":"umpire","ref_id":48152},{"sent":"batter","ref_id":48153},{"sent":"man in black shirt","ref_id":48201},{"sent":"man","ref_id":48202},{"sent":"person in front","ref_id":48304},{"sent":"person in middle","ref_id":48305},{"sent":"player in white","ref_id":48322},{"sent":"player","ref_id":48323},{"sent":"man","ref_id":48363},{"sent":"man in black","ref_id":48364},{"sent":"woman","ref_id":48365},{"sent":"man on left","ref_id":48374},{"sent":"right elephant","ref_id":48375},{"sent":"right guy","ref_id":48473},{"sent":"man in middle","ref_id":48474},{"sent":"black umbrella","ref_id":48475},{"sent":"right horse","ref_id":48476},{"sent":"man on left","ref_id":48477},{"sent":"man on left","ref_id":48478},{"sent":"batter","ref_id":48571},{"sent":"catcher","ref_id":48572},{"sent":"girl in 
blue","ref_id":48610},{"sent":"girl in white","ref_id":48611},{"sent":"man in white shirt","ref_id":48705},{"sent":"woman on right","ref_id":48706},{"sent":"woman in black","ref_id":48707},{"sent":"left player","ref_id":48745},{"sent":"right player","ref_id":48746},{"sent":"man","ref_id":48790},{"sent":"man on right","ref_id":48791},{"sent":"woman","ref_id":48983},{"sent":"man on right","ref_id":48984},{"sent":"man in black","ref_id":49014},{"sent":"guy in background","ref_id":49015},{"sent":"batter","ref_id":49016},{"sent":"player in black","ref_id":49199},{"sent":"player in red","ref_id":49200},{"sent":"player in red","ref_id":49201},{"sent":"man","ref_id":49312},{"sent":"man","ref_id":49313},{"sent":"person on left","ref_id":49315},{"sent":"person in middle","ref_id":49316},{"sent":"head on left","ref_id":49373},{"sent":"person on right","ref_id":49374},{"sent":"man in white","ref_id":49375},{"sent":"woman","ref_id":49457},{"sent":"man","ref_id":49458},{"sent":"woman","ref_id":49538},{"sent":"woman","ref_id":49539},{"sent":"right leg","ref_id":49600},{"sent":"left leg","ref_id":49601},{"sent":"person on right","ref_id":49606},{"sent":"batter","ref_id":49607},{"sent":"person on right","ref_id":49620},{"sent":"person on left","ref_id":49621},{"sent":"man","ref_id":49622},{"sent":"man in black shirt","ref_id":49638},{"sent":"man in blue","ref_id":49639},{"sent":"woman on right","ref_id":49747},{"sent":"woman","ref_id":49748},{"sent":"woman in white","ref_id":49833},{"sent":"woman in black","ref_id":49834},{"sent":"guy in black shirt","ref_id":49932},{"sent":"white shirt","ref_id":49933},{"sent":"man in black","ref_id":47},{"sent":"person on right","ref_id":109},{"sent":"woman in red","ref_id":110},{"sent":"car behind bike","ref_id":111},{"sent":"car on left","ref_id":112},{"sent":"man in blue","ref_id":382},{"sent":"man in white","ref_id":383},{"sent":"left person","ref_id":519},{"sent":"man on right","ref_id":520}]} \ No newline at end of file diff --git 
a/tools/refer/test/sample_expressions_testB.json b/tools/refer/test/sample_expressions_testB.json new file mode 100644 index 0000000..eab973c --- /dev/null +++ b/tools/refer/test/sample_expressions_testB.json @@ -0,0 +1 @@ +{"predictions":[{"sent":"car on left","ref_id":25},{"sent":"car on left","ref_id":26},{"sent":"top sandwich","ref_id":27},{"sent":"top left donut","ref_id":28},{"sent":"zebra on left","ref_id":45},{"sent":"right zebra","ref_id":46},{"sent":"chair in front of man","ref_id":164},{"sent":"bottom right corner","ref_id":165},{"sent":"left chair","ref_id":166},{"sent":"top right corner","ref_id":232},{"sent":"pizza in front","ref_id":233},{"sent":"glass in back","ref_id":234},{"sent":"left glass","ref_id":235},{"sent":"yellow fruit on left","ref_id":259},{"sent":"apple in front","ref_id":260},{"sent":"yellow apple","ref_id":261},{"sent":"orange in the middle","ref_id":262},{"sent":"bottom right orange","ref_id":285},{"sent":"bottom left apple","ref_id":286},{"sent":"bottom right green apple","ref_id":287},{"sent":"second row from bottom right","ref_id":288},{"sent":"white bear","ref_id":299},{"sent":"brown bear","ref_id":300},{"sent":"red vase","ref_id":326},{"sent":"red vase","ref_id":327},{"sent":"vase","ref_id":328},{"sent":"glass on left","ref_id":360},{"sent":"glass of beer","ref_id":361},{"sent":"bottle on right","ref_id":362},{"sent":"bottle on left","ref_id":363},{"sent":"bottle of wine bottle on left","ref_id":364},{"sent":"right horse","ref_id":435},{"sent":"left horse","ref_id":436},{"sent":"train on right","ref_id":474},{"sent":"train on left","ref_id":475},{"sent":"right elephant","ref_id":545},{"sent":"boat on right","ref_id":605},{"sent":"white car","ref_id":629},{"sent":"white car","ref_id":630},{"sent":"left bed","ref_id":668},{"sent":"bed","ref_id":669},{"sent":"right bike","ref_id":677},{"sent":"left bike","ref_id":678},{"sent":"left table","ref_id":721},{"sent":"table","ref_id":722},{"sent":"traffic 
light","ref_id":837},{"sent":"traffic light","ref_id":838},{"sent":"front bike","ref_id":855},{"sent":"front bike","ref_id":856},{"sent":"blue tie","ref_id":923},{"sent":"left tie","ref_id":924},{"sent":"right tie","ref_id":925},{"sent":"red tie","ref_id":926},{"sent":"monitor on right","ref_id":940},{"sent":"monitor on right","ref_id":941},{"sent":"pizza in front","ref_id":1155},{"sent":"pizza slice","ref_id":1156},{"sent":"bottom left bananas","ref_id":1218},{"sent":"top left bananas","ref_id":1219},{"sent":"pizza in front","ref_id":1227},{"sent":"pizza in front","ref_id":1228},{"sent":"baby elephant","ref_id":1256},{"sent":"big elephant","ref_id":1257},{"sent":"right orange","ref_id":1273},{"sent":"top orange","ref_id":1274},{"sent":"horse on left","ref_id":1283},{"sent":"left fridge","ref_id":1339},{"sent":"fridge in front of the fridge","ref_id":1340},{"sent":"right cow","ref_id":1368},{"sent":"white truck","ref_id":1644},{"sent":"white car","ref_id":1645},{"sent":"broccoli in front","ref_id":1776},{"sent":"broccoli on right","ref_id":1777},{"sent":"second row from right","ref_id":1865},{"sent":"top middle sandwich","ref_id":1866},{"sent":"right most sandwich","ref_id":1867},{"sent":"left sandwich","ref_id":1868},{"sent":"left most sandwich","ref_id":1869},{"sent":"middle row second from right","ref_id":1870},{"sent":"second from right","ref_id":1871},{"sent":"elephant on right","ref_id":2033},{"sent":"elephant","ref_id":2034},{"sent":"second from left","ref_id":2103},{"sent":"right most yellow","ref_id":2104},{"sent":"second row from right","ref_id":2105},{"sent":"top right donut","ref_id":2122},{"sent":"bottom left donut","ref_id":2123},{"sent":"bottom left donut","ref_id":2124},{"sent":"middle donut","ref_id":2125},{"sent":"right donut","ref_id":2126},{"sent":"top right dessert","ref_id":2370},{"sent":"middle dessert","ref_id":2371},{"sent":"bowl","ref_id":2392},{"sent":"left bowl","ref_id":2393},{"sent":"bus on right","ref_id":2467},{"sent":"bus in 
front","ref_id":2468},{"sent":"left monitor","ref_id":2540},{"sent":"right monitor","ref_id":2541},{"sent":"blurry food in back","ref_id":2576},{"sent":"bottom right corner","ref_id":2577},{"sent":"glass in front of the woman","ref_id":2578},{"sent":"glass on left","ref_id":2579},{"sent":"plant on right","ref_id":2642},{"sent":"top right corner","ref_id":2643},{"sent":"green plant","ref_id":2644},{"sent":"bike on right","ref_id":2692},{"sent":"bike on the right","ref_id":2693},{"sent":"bike on left","ref_id":2694},{"sent":"bottom right red","ref_id":2738},{"sent":"bottom left UNK","ref_id":2739},{"sent":"left cat","ref_id":2857},{"sent":"cat on right","ref_id":2858},{"sent":"left elephant","ref_id":2937},{"sent":"right elephant","ref_id":2938},{"sent":"elephant on right","ref_id":2939},{"sent":"left person","ref_id":2944},{"sent":"person on left","ref_id":2945},{"sent":"top left hot dog","ref_id":2946},{"sent":"sandwich on left","ref_id":2947},{"sent":"front sandwich","ref_id":2948},{"sent":"top right sandwich","ref_id":2949},{"sent":"top sandwich","ref_id":2950},{"sent":"right","ref_id":2960},{"sent":"white and white","ref_id":2961},{"sent":"right train","ref_id":2962},{"sent":"right train","ref_id":2963},{"sent":"plant in the middle","ref_id":3028},{"sent":"right plant","ref_id":3029},{"sent":"left umbrella","ref_id":3125},{"sent":"umbrella","ref_id":3126},{"sent":"second from left","ref_id":3224},{"sent":"right box","ref_id":3225},{"sent":"left UNK","ref_id":3226},{"sent":"left horse","ref_id":3303},{"sent":"horse in front","ref_id":3304},{"sent":"bird on right","ref_id":3403},{"sent":"bird on left","ref_id":3404},{"sent":"table","ref_id":3656},{"sent":"left UNK","ref_id":3657},{"sent":"chair on right","ref_id":3844},{"sent":"top right bunk","ref_id":3845},{"sent":"left bear","ref_id":3875},{"sent":"right bear","ref_id":3876},{"sent":"top suitcase","ref_id":3910},{"sent":"right suitcase","ref_id":3911},{"sent":"bottom right 
suitcase","ref_id":3912},{"sent":"couch on left","ref_id":3919},{"sent":"right couch","ref_id":3920},{"sent":"left wine bottle","ref_id":3931},{"sent":"right bottle","ref_id":3932},{"sent":"top layer","ref_id":3941},{"sent":"front","ref_id":3942},{"sent":"bear on left","ref_id":3950},{"sent":"bear on right","ref_id":3951},{"sent":"bear on right","ref_id":3952},{"sent":"big bear","ref_id":3954},{"sent":"bottom left bear","ref_id":3955},{"sent":"bottom left suitcase","ref_id":4004},{"sent":"top right corner","ref_id":4005},{"sent":"left sandwich","ref_id":4021},{"sent":"sandwich on left","ref_id":4022},{"sent":"right bottle","ref_id":4072},{"sent":"second from left","ref_id":4073},{"sent":"UNK bottle","ref_id":4074},{"sent":"UNK","ref_id":4075},{"sent":"zebra on left","ref_id":4329},{"sent":"zebra in front","ref_id":4330},{"sent":"zebra on left","ref_id":4500},{"sent":"right zebra","ref_id":4501},{"sent":"glass on left","ref_id":4724},{"sent":"bus on right","ref_id":4806},{"sent":"bottom black","ref_id":4822},{"sent":"black suitcase on right","ref_id":4823},{"sent":"left chair","ref_id":4911},{"sent":"boat on left","ref_id":4912},{"sent":"top right corner","ref_id":4915},{"sent":"top left donut","ref_id":4916},{"sent":"top middle donut","ref_id":4917},{"sent":"bottom right donut","ref_id":4918},{"sent":"middle donut","ref_id":4919},{"sent":"top left apple","ref_id":4925},{"sent":"orange on right","ref_id":4926},{"sent":"orange in middle","ref_id":4927},{"sent":"middle apple","ref_id":4928},{"sent":"bottom right carrot","ref_id":4981},{"sent":"orange carrot","ref_id":4982},{"sent":"bottom right corner","ref_id":4987},{"sent":"car bottom left","ref_id":4988},{"sent":"top left corner","ref_id":5000},{"sent":"the cat","ref_id":5001},{"sent":"bottom right sheep","ref_id":5037},{"sent":"left sheep","ref_id":5038},{"sent":"sheep in front","ref_id":5039},{"sent":"right giraffe","ref_id":5040},{"sent":"right giraffe","ref_id":5041},{"sent":"right 
remote","ref_id":5072},{"sent":"left remote","ref_id":5073},{"sent":"white bowl of food","ref_id":5074},{"sent":"hot dog","ref_id":5075},{"sent":"top right slice","ref_id":5116},{"sent":"left sandwich","ref_id":5117},{"sent":"bottom left food","ref_id":5178},{"sent":"pizza on right","ref_id":5179},{"sent":"bowl of food on left","ref_id":5180},{"sent":"left bowl","ref_id":5181},{"sent":"left monitor","ref_id":5242},{"sent":"right monitor","ref_id":5243},{"sent":"right cow","ref_id":5298},{"sent":"cow on left","ref_id":5299},{"sent":"left horse","ref_id":5327},{"sent":"horse in front","ref_id":5328},{"sent":"horse on right","ref_id":5329},{"sent":"elephant in middle","ref_id":5340},{"sent":"left elephant","ref_id":5341},{"sent":"white car","ref_id":5491},{"sent":"white car","ref_id":5492},{"sent":"yellow car","ref_id":5493},{"sent":"top middle brown bear","ref_id":5521},{"sent":"bear on right","ref_id":5522},{"sent":"right bear","ref_id":5523},{"sent":"top right bear","ref_id":5524},{"sent":"top right bear","ref_id":5525},{"sent":"bear on left","ref_id":5526},{"sent":"bear in middle","ref_id":5527},{"sent":"toilet on left","ref_id":5645},{"sent":"chair in front","ref_id":5646},{"sent":"right sheep","ref_id":5669},{"sent":"left sheep","ref_id":5670},{"sent":"left chair","ref_id":5694},{"sent":"right bed","ref_id":5695},{"sent":"right train","ref_id":5797},{"sent":"right train","ref_id":5798},{"sent":"right slice","ref_id":5809},{"sent":"top left donut","ref_id":5810},{"sent":"white car","ref_id":5829},{"sent":"right white bus","ref_id":5830},{"sent":"white truck","ref_id":5831},{"sent":"donut on right","ref_id":5967},{"sent":"bottom right donut","ref_id":5968},{"sent":"right donut","ref_id":5969},{"sent":"donut on the right","ref_id":5970},{"sent":"left umbrella","ref_id":6053},{"sent":"middle banana","ref_id":6054},{"sent":"left bear","ref_id":6055},{"sent":"left UNK","ref_id":6056},{"sent":"left sheep","ref_id":6258},{"sent":"sheep in 
middle","ref_id":6259},{"sent":"right sheep","ref_id":6260},{"sent":"middle bowl","ref_id":6278},{"sent":"top left food","ref_id":6279},{"sent":"top left tray","ref_id":6280},{"sent":"top right bread","ref_id":6281},{"sent":"bottom left","ref_id":6282},{"sent":"right bread","ref_id":6283},{"sent":"bottom left bread","ref_id":6284},{"sent":"bottom left bowl","ref_id":6312},{"sent":"green apple","ref_id":6313},{"sent":"right clock","ref_id":6843},{"sent":"left clock","ref_id":6844},{"sent":"left bed","ref_id":6927},{"sent":"right bed","ref_id":6928},{"sent":"bottom left corner","ref_id":6929},{"sent":"bed","ref_id":6974},{"sent":"bed on right","ref_id":6975},{"sent":"top right broccoli","ref_id":7143},{"sent":"broccoli in front","ref_id":7144},{"sent":"broccoli on right","ref_id":7159},{"sent":"broccoli on left","ref_id":7160},{"sent":"UNK","ref_id":7183},{"sent":"UNK","ref_id":7184},{"sent":"left cow","ref_id":7252},{"sent":"right cow","ref_id":7253},{"sent":"bottom right corner","ref_id":7268},{"sent":"front bike","ref_id":7269},{"sent":"left suitcase","ref_id":7316},{"sent":"right suitcase","ref_id":7317},{"sent":"black suitcase","ref_id":7318},{"sent":"second suitcase from left","ref_id":7319},{"sent":"second bus from right","ref_id":7636},{"sent":"right bus","ref_id":7637},{"sent":"right duck","ref_id":7645},{"sent":"left duck","ref_id":7646},{"sent":"left duck","ref_id":7647},{"sent":"umbrella on left","ref_id":7801},{"sent":"right umbrella","ref_id":7802},{"sent":"broccoli on left","ref_id":7812},{"sent":"broccoli on the right","ref_id":7813},{"sent":"top bear","ref_id":7838},{"sent":"left bear","ref_id":7839},{"sent":"right bear","ref_id":7840},{"sent":"right horse","ref_id":7895},{"sent":"left dog","ref_id":7896},{"sent":"right slice","ref_id":7916},{"sent":"pizza slice","ref_id":7917},{"sent":"elephant on left","ref_id":8085},{"sent":"bear on left","ref_id":8128},{"sent":"left couch","ref_id":8210},{"sent":"right couch","ref_id":8211},{"sent":"right 
monitor","ref_id":8352},{"sent":"top left monitor","ref_id":8353},{"sent":"top left banana","ref_id":8380},{"sent":"banana in middle","ref_id":8381},{"sent":"banana on left","ref_id":8382},{"sent":"bowl of soup","ref_id":8649},{"sent":"bowl of food on left","ref_id":8650},{"sent":"white car","ref_id":8681},{"sent":"top left corner","ref_id":8729},{"sent":"glass","ref_id":8781},{"sent":"drink","ref_id":8782},{"sent":"left cow","ref_id":8788},{"sent":"left sheep","ref_id":8789},{"sent":"right sheep","ref_id":8790},{"sent":"top left sheep","ref_id":8806},{"sent":"sheep","ref_id":8807},{"sent":"top left donut","ref_id":8855},{"sent":"top left donut","ref_id":8856},{"sent":"donut on left","ref_id":8857},{"sent":"top right donut","ref_id":8858},{"sent":"front bike","ref_id":8909},{"sent":"left bike","ref_id":8910},{"sent":"right duck","ref_id":9214},{"sent":"left duck","ref_id":9215},{"sent":"top duck","ref_id":9216},{"sent":"right bus","ref_id":9233},{"sent":"bus in front","ref_id":9234},{"sent":"donut in front","ref_id":9295},{"sent":"donut on right","ref_id":9296},{"sent":"bottom right donut","ref_id":9305},{"sent":"bottom donut","ref_id":9306},{"sent":"donut in front","ref_id":9307},{"sent":"donut in front","ref_id":9308},{"sent":"left umbrella","ref_id":9423},{"sent":"left umbrella","ref_id":9424},{"sent":"top right umbrella","ref_id":9425},{"sent":"right umbrella","ref_id":9426},{"sent":"left plane","ref_id":9446},{"sent":"plane","ref_id":9447},{"sent":"right giraffe","ref_id":9560},{"sent":"left giraffe","ref_id":9561},{"sent":"hand","ref_id":9574},{"sent":"hand","ref_id":9575},{"sent":"bottom left apple","ref_id":9576},{"sent":"right giraffe","ref_id":9598},{"sent":"giraffe on left","ref_id":9599},{"sent":"black suitcase","ref_id":9628},{"sent":"elephant in front","ref_id":9629},{"sent":"elephant in front","ref_id":9630},{"sent":"elephant on right","ref_id":9631},{"sent":"right couch","ref_id":9707},{"sent":"couch","ref_id":9708},{"sent":"right 
horse","ref_id":9836},{"sent":"horse on left","ref_id":9837},{"sent":"horse on left","ref_id":9919},{"sent":"white horse","ref_id":9920},{"sent":"horse in front","ref_id":9921},{"sent":"bear on left","ref_id":10035},{"sent":"bear on left","ref_id":10036},{"sent":"right sandwich","ref_id":10110},{"sent":"left hot dog","ref_id":10111},{"sent":"elephant on right","ref_id":10239},{"sent":"elephant on right","ref_id":10240},{"sent":"elephant on left","ref_id":10241},{"sent":"left bike","ref_id":10380},{"sent":"motorcycle","ref_id":10381},{"sent":"white car","ref_id":10382},{"sent":"white car","ref_id":10383},{"sent":"right bird","ref_id":10601},{"sent":"left duck","ref_id":10602},{"sent":"red laptop","ref_id":10795},{"sent":"red and white UNK","ref_id":10796},{"sent":"cat on right","ref_id":10847},{"sent":"chair on left","ref_id":10907},{"sent":"chair in front of woman","ref_id":10908},{"sent":"orange on left","ref_id":11114},{"sent":"orange on top right","ref_id":11115},{"sent":"truck on right","ref_id":11131},{"sent":"truck on left","ref_id":11132},{"sent":"right piece of broccoli","ref_id":11192},{"sent":"right piece of food","ref_id":11193},{"sent":"glass on right","ref_id":11281},{"sent":"glass in front of wine glass","ref_id":11282},{"sent":"left couch","ref_id":11328},{"sent":"right black couch","ref_id":11329},{"sent":"left couch","ref_id":11330},{"sent":"right chair","ref_id":11331},{"sent":"bottom row second from left","ref_id":11982},{"sent":"bottom left donut","ref_id":11983},{"sent":"top row second from left","ref_id":11984},{"sent":"top row second from right","ref_id":11985},{"sent":"top right donut","ref_id":11986},{"sent":"top right donut","ref_id":11987},{"sent":"second row from right","ref_id":11988},{"sent":"top right donut","ref_id":11989},{"sent":"bottom left donut","ref_id":11990},{"sent":"middle row second from right","ref_id":11991},{"sent":"middle row second from right","ref_id":11992},{"sent":"bottom right donut","ref_id":11993},{"sent":"cow on 
right","ref_id":12041},{"sent":"cow on right","ref_id":12042},{"sent":"left cow","ref_id":12043},{"sent":"right bear","ref_id":12064},{"sent":"left bear","ref_id":12065},{"sent":"right bear","ref_id":12066},{"sent":"left bear","ref_id":12067},{"sent":"blue bike","ref_id":12106},{"sent":"front bike","ref_id":12107},{"sent":"middle row second from right","ref_id":12134},{"sent":"top left donut","ref_id":12135},{"sent":"middle row second from left","ref_id":12136},{"sent":"middle row second from left","ref_id":12137},{"sent":"bottom left donut","ref_id":12138},{"sent":"middle row","ref_id":12139},{"sent":"middle row second from right","ref_id":12140},{"sent":"right zebra","ref_id":12181},{"sent":"zebra on left","ref_id":12182},{"sent":"horse on right","ref_id":12239},{"sent":"horse on left","ref_id":12240},{"sent":"right","ref_id":12299},{"sent":"middle","ref_id":12300},{"sent":"the little girl","ref_id":12394},{"sent":"bottom right corner","ref_id":12395},{"sent":"left giraffe","ref_id":12407},{"sent":"right giraffe","ref_id":12408},{"sent":"left bus","ref_id":12421},{"sent":"right bus","ref_id":12422},{"sent":"right bus","ref_id":12423},{"sent":"left bike","ref_id":12464},{"sent":"bike on right","ref_id":12465},{"sent":"bike in front","ref_id":12466},{"sent":"bench","ref_id":12539},{"sent":"table","ref_id":12540},{"sent":"right boat","ref_id":12681},{"sent":"left boat","ref_id":12682},{"sent":"left cow","ref_id":12792},{"sent":"cow in front","ref_id":12793},{"sent":"right glass","ref_id":12899},{"sent":"left glass","ref_id":12900},{"sent":"second glass from left","ref_id":12901},{"sent":"middle glass","ref_id":12902},{"sent":"right giraffe","ref_id":12983},{"sent":"middle giraffe","ref_id":12984},{"sent":"right bus","ref_id":13072},{"sent":"left bus","ref_id":13073},{"sent":"top right bear","ref_id":13080},{"sent":"top bear","ref_id":13081},{"sent":"right chair","ref_id":13100},{"sent":"bottom right corner","ref_id":13101},{"sent":"cake in 
front","ref_id":13110},{"sent":"middle donut","ref_id":13111},{"sent":"bottom clock","ref_id":13188},{"sent":"clock on right","ref_id":13189},{"sent":"middle bear","ref_id":13298},{"sent":"left bear","ref_id":13299},{"sent":"bottom right dish","ref_id":13338},{"sent":"bottom left dish","ref_id":13339},{"sent":"bottom left bowl","ref_id":13340},{"sent":"bottom right bowl","ref_id":13341},{"sent":"bottom right bowl","ref_id":13342},{"sent":"top right bowl","ref_id":13343},{"sent":"top right container","ref_id":13344},{"sent":"bottom left bowl","ref_id":13345},{"sent":"white bird on right","ref_id":13370},{"sent":"middle duck","ref_id":13371},{"sent":"right bus","ref_id":13450},{"sent":"bus in front","ref_id":13451},{"sent":"motorcycle in front","ref_id":13459},{"sent":"motorcycle on left","ref_id":13460},{"sent":"white car on right","ref_id":13461},{"sent":"cow in front","ref_id":13504},{"sent":"right horse","ref_id":13505},{"sent":"cow on left","ref_id":13506},{"sent":"white car top left","ref_id":13511},{"sent":"blue bus","ref_id":13512},{"sent":"bus in middle","ref_id":13520},{"sent":"right bus","ref_id":13521},{"sent":"giraffe in front","ref_id":13658},{"sent":"giraffe on left","ref_id":13659},{"sent":"left monitor","ref_id":13708},{"sent":"couch on right","ref_id":13768},{"sent":"couch on left","ref_id":13769},{"sent":"left racket","ref_id":13819},{"sent":"right racket","ref_id":13820},{"sent":"hand on left","ref_id":13825},{"sent":"top left corner","ref_id":13826},{"sent":"top donut","ref_id":13827},{"sent":"left bus","ref_id":13830},{"sent":"bus on right","ref_id":13831},{"sent":"red car","ref_id":13832},{"sent":"chair on right","ref_id":13851},{"sent":"chair on left","ref_id":13852},{"sent":"banana on top","ref_id":13871},{"sent":"top banana","ref_id":13872},{"sent":"banana","ref_id":13873},{"sent":"top banana","ref_id":13874},{"sent":"bottom left sandwich","ref_id":14098},{"sent":"left sandwich","ref_id":14099},{"sent":"bananas in 
front","ref_id":14184},{"sent":"bananas","ref_id":14185},{"sent":"right zebra","ref_id":14283},{"sent":"zebra in the middle","ref_id":14284},{"sent":"top clock","ref_id":14470},{"sent":"bottom clock","ref_id":14471},{"sent":"right horse","ref_id":14509},{"sent":"left cow","ref_id":14510},{"sent":"red book","ref_id":14551},{"sent":"green book","ref_id":14552},{"sent":"zebra on right","ref_id":14652},{"sent":"zebra in front","ref_id":14653},{"sent":"right bird","ref_id":14665},{"sent":"left bird","ref_id":14666},{"sent":"orange","ref_id":14727},{"sent":"orange on right","ref_id":14728},{"sent":"orange on right","ref_id":14729},{"sent":"top right apple","ref_id":14730},{"sent":"left elephant","ref_id":14731},{"sent":"left elephant","ref_id":14732},{"sent":"elephant on right","ref_id":14733},{"sent":"right train","ref_id":14758},{"sent":"red train","ref_id":14759},{"sent":"broccoli","ref_id":14825},{"sent":"bottom left corner","ref_id":14826},{"sent":"left meter","ref_id":14846},{"sent":"right meter","ref_id":14847},{"sent":"middle fridge","ref_id":14848},{"sent":"left fridge","ref_id":14849},{"sent":"right fridge","ref_id":14850},{"sent":"red couch","ref_id":14942},{"sent":"right couch","ref_id":14943},{"sent":"bear on left","ref_id":14944},{"sent":"bear on the right","ref_id":14945},{"sent":"right giraffe","ref_id":14954},{"sent":"right giraffe","ref_id":14955},{"sent":"UNK","ref_id":15013},{"sent":"top cake","ref_id":15014},{"sent":"left sheep","ref_id":15022},{"sent":"white sheep","ref_id":15023},{"sent":"top right","ref_id":15024},{"sent":"top right sheep","ref_id":15025},{"sent":"sheep in front","ref_id":15026},{"sent":"top right broccoli","ref_id":15071},{"sent":"broccoli on right","ref_id":15072},{"sent":"broccoli in front","ref_id":15073},{"sent":"bike on right","ref_id":15135},{"sent":"bike on left","ref_id":15136},{"sent":"right bottle","ref_id":15241},{"sent":"right bottle","ref_id":15242},{"sent":"pizza on right","ref_id":15306},{"sent":"pizza on 
right","ref_id":15307},{"sent":"slice of pizza on left","ref_id":15308},{"sent":"top left slice","ref_id":15309},{"sent":"horse on left","ref_id":15310},{"sent":"horse on right","ref_id":15311},{"sent":"left bike","ref_id":15318},{"sent":"left bike","ref_id":15319},{"sent":"elephant on left","ref_id":15450},{"sent":"right elephant","ref_id":15451},{"sent":"elephant in front","ref_id":15452},{"sent":"elephant on left","ref_id":15469},{"sent":"right elephant","ref_id":15470},{"sent":"blue car","ref_id":15580},{"sent":"left car","ref_id":15581},{"sent":"pizza on right","ref_id":15686},{"sent":"pizza","ref_id":15687},{"sent":"chair on the left","ref_id":15741},{"sent":"left chair","ref_id":15742},{"sent":"bottom right","ref_id":15767},{"sent":"top hot dog","ref_id":15768},{"sent":"sandwich","ref_id":15877},{"sent":"left sandwich","ref_id":15878},{"sent":"left screen","ref_id":15975},{"sent":"left screen","ref_id":15976},{"sent":"right screen","ref_id":15977},{"sent":"bottom bench","ref_id":15984},{"sent":"right couch","ref_id":15985},{"sent":"left suitcase","ref_id":16044},{"sent":"second from left","ref_id":16045},{"sent":"red bus","ref_id":16049},{"sent":"bus on left","ref_id":16050},{"sent":"bus","ref_id":16051},{"sent":"second from right","ref_id":16238},{"sent":"second from left","ref_id":16239},{"sent":"bottom right corner","ref_id":16240},{"sent":"middle monitor","ref_id":16241},{"sent":"cow on right","ref_id":16348},{"sent":"white cow","ref_id":16349},{"sent":"cow on left","ref_id":16350},{"sent":"red bus on right","ref_id":16389},{"sent":"middle bus","ref_id":16390},{"sent":"left bear","ref_id":16394},{"sent":"bear on right","ref_id":16395},{"sent":"bear in middle","ref_id":16396},{"sent":"bottom left zebra","ref_id":16436},{"sent":"zebra in front","ref_id":16437},{"sent":"zebra on right","ref_id":16438},{"sent":"zebra in front","ref_id":16439},{"sent":"right side of cat","ref_id":16442},{"sent":"left one","ref_id":16465},{"sent":"chair on the 
right","ref_id":16477},{"sent":"right chair","ref_id":16478},{"sent":"chair on left","ref_id":16479},{"sent":"left chair","ref_id":16480},{"sent":"right giraffe","ref_id":16501},{"sent":"left giraffe","ref_id":16502},{"sent":"giraffe on left","ref_id":16503},{"sent":"left toothbrush","ref_id":16528},{"sent":"the UNK","ref_id":16529},{"sent":"screen","ref_id":16552},{"sent":"right book","ref_id":16553},{"sent":"bottom phone","ref_id":16554},{"sent":"right screen","ref_id":16555},{"sent":"right laptop","ref_id":16575},{"sent":"monitor on the right","ref_id":16576},{"sent":"white bus","ref_id":16622},{"sent":"yellow bus","ref_id":16623},{"sent":"right clock","ref_id":16642},{"sent":"clock on left","ref_id":16643},{"sent":"clock on right","ref_id":16644},{"sent":"bear in front","ref_id":16653},{"sent":"bear on right","ref_id":16654},{"sent":"woman","ref_id":16706},{"sent":"right half of person","ref_id":16707},{"sent":"bottom left corner","ref_id":16708},{"sent":"left side of table","ref_id":16709},{"sent":"bear on right","ref_id":16764},{"sent":"bear in middle","ref_id":16765},{"sent":"bear on left","ref_id":16766},{"sent":"right bottom corner","ref_id":16808},{"sent":"left bottom corner","ref_id":16809},{"sent":"right bottle","ref_id":16810},{"sent":"left bottle","ref_id":16811},{"sent":"second from right","ref_id":16812},{"sent":"second from right","ref_id":16813},{"sent":"second bottle from left","ref_id":16814},{"sent":"second bottle from left","ref_id":16815},{"sent":"second bottle from left","ref_id":16816},{"sent":"top dog","ref_id":16911},{"sent":"top hot dog","ref_id":16912},{"sent":"plant in front of the tree","ref_id":17021},{"sent":"green flowers","ref_id":17022},{"sent":"horse on right","ref_id":17140},{"sent":"horse on left","ref_id":17141},{"sent":"bottom left white cake","ref_id":17184},{"sent":"middle donut","ref_id":17185},{"sent":"right side of pic","ref_id":17301},{"sent":"bottom left corner","ref_id":17302},{"sent":"giraffe in 
back","ref_id":17382},{"sent":"right giraffe","ref_id":17383},{"sent":"white boat","ref_id":17427},{"sent":"left guy","ref_id":17428},{"sent":"boat on right","ref_id":17429},{"sent":"white boat","ref_id":17430},{"sent":"bed on left","ref_id":17585},{"sent":"bed","ref_id":17586},{"sent":"middle animal","ref_id":17806},{"sent":"chair in middle","ref_id":17822},{"sent":"chair on left","ref_id":17823},{"sent":"left bus","ref_id":17830},{"sent":"right bus","ref_id":17831},{"sent":"right bike","ref_id":17835},{"sent":"left bike","ref_id":17836},{"sent":"second bike from left","ref_id":17837},{"sent":"blue bike","ref_id":17838},{"sent":"red thing","ref_id":17894},{"sent":"red suitcase","ref_id":17895},{"sent":"cup on right","ref_id":18051},{"sent":"train on left","ref_id":18071},{"sent":"right train","ref_id":18072},{"sent":"kid on left","ref_id":18129},{"sent":"kid on right","ref_id":18130},{"sent":"right side of pic","ref_id":18131},{"sent":"white shirt","ref_id":18143},{"sent":"blond hair","ref_id":18144},{"sent":"right side of bike","ref_id":18171},{"sent":"left bike","ref_id":18172},{"sent":"right one","ref_id":18281},{"sent":"bird on left","ref_id":18282},{"sent":"pizza","ref_id":18295},{"sent":"top left bowl","ref_id":18296},{"sent":"right zebra","ref_id":18305},{"sent":"zebra in front","ref_id":18306},{"sent":"left elephant","ref_id":18443},{"sent":"baby elephant","ref_id":18444},{"sent":"left meter","ref_id":18462},{"sent":"right meter","ref_id":18463},{"sent":"left UNK","ref_id":18496},{"sent":"white book","ref_id":18497},{"sent":"giraffe in front","ref_id":18537},{"sent":"top giraffe","ref_id":18538},{"sent":"cat on left","ref_id":18681},{"sent":"cat on right","ref_id":18682},{"sent":"left sheep","ref_id":18698},{"sent":"left sheep","ref_id":18699},{"sent":"baby","ref_id":18700},{"sent":"right animal","ref_id":18726},{"sent":"right sheep","ref_id":18727},{"sent":"right slice","ref_id":18736},{"sent":"left pizza","ref_id":18737},{"sent":"giraffe on 
left","ref_id":18906},{"sent":"left giraffe","ref_id":18907},{"sent":"top left book","ref_id":18927},{"sent":"right horse","ref_id":19026},{"sent":"middle horse","ref_id":19027},{"sent":"top right bowl","ref_id":19032},{"sent":"bottom left bowl","ref_id":19033},{"sent":"bowl","ref_id":19034},{"sent":"bottom right bowl","ref_id":19035},{"sent":"oranges","ref_id":19125},{"sent":"oranges on left","ref_id":19126},{"sent":"orange in front","ref_id":19127},{"sent":"left side of the orange","ref_id":19249},{"sent":"white car on right","ref_id":19250},{"sent":"right book","ref_id":19533},{"sent":"bottom book","ref_id":19534},{"sent":"bottom book","ref_id":19535},{"sent":"UNK","ref_id":19536},{"sent":"bananas on right","ref_id":19589},{"sent":"left banana","ref_id":19590},{"sent":"chair on right","ref_id":19594},{"sent":"chair on right","ref_id":19595},{"sent":"left bear","ref_id":19626},{"sent":"right bear","ref_id":19627},{"sent":"left couch","ref_id":19651},{"sent":"bed on right","ref_id":19652},{"sent":"top left bowl","ref_id":19653},{"sent":"top right bowl","ref_id":19654},{"sent":"bottom left bowl","ref_id":19655},{"sent":"bottom right","ref_id":19656},{"sent":"chair on right","ref_id":19839},{"sent":"front bench","ref_id":19840},{"sent":"red boat","ref_id":19990},{"sent":"white plane on left","ref_id":19991},{"sent":"right train","ref_id":20030},{"sent":"train","ref_id":20031},{"sent":"left glass","ref_id":20032},{"sent":"middle glass","ref_id":20033},{"sent":"glass on right","ref_id":20034},{"sent":"middle animal","ref_id":20279},{"sent":"left cow","ref_id":20280},{"sent":"cow in middle","ref_id":20281},{"sent":"right dog","ref_id":20316},{"sent":"bottom right corner","ref_id":20317},{"sent":"black suitcase","ref_id":20318},{"sent":"top right car","ref_id":20331},{"sent":"red car","ref_id":20332},{"sent":"bottom suitcase","ref_id":20673},{"sent":"red suitcase","ref_id":20674},{"sent":"top suitcase","ref_id":20675},{"sent":"bottom 
suitcase","ref_id":20676},{"sent":"top right dog","ref_id":20733},{"sent":"cat on the right","ref_id":20734},{"sent":"car on left","ref_id":20793},{"sent":"car on the left","ref_id":20794},{"sent":"red car","ref_id":20865},{"sent":"white car","ref_id":20866},{"sent":"red","ref_id":20925},{"sent":"boat on right","ref_id":20926},{"sent":"bottom right corner","ref_id":20927},{"sent":"top right donut","ref_id":20981},{"sent":"right donut","ref_id":20982},{"sent":"white plate on the right","ref_id":21023},{"sent":"white plate on right","ref_id":21024},{"sent":"chair on right","ref_id":21053},{"sent":"couch","ref_id":21054},{"sent":"sandwich on left","ref_id":21162},{"sent":"sandwich on right","ref_id":21163},{"sent":"right couch","ref_id":21235},{"sent":"right bed","ref_id":21236},{"sent":"bottom right corner","ref_id":21256},{"sent":"bottom left UNK","ref_id":21257},{"sent":"bed on left","ref_id":21288},{"sent":"bed","ref_id":21289},{"sent":"bottom suitcase","ref_id":21638},{"sent":"left carrot","ref_id":21639},{"sent":"UNK","ref_id":21716},{"sent":"top right book","ref_id":21717},{"sent":"right bird","ref_id":21748},{"sent":"duck","ref_id":21749},{"sent":"chair on right","ref_id":21825},{"sent":"chair on left","ref_id":21826},{"sent":"glass on right","ref_id":21925},{"sent":"cup on right","ref_id":21926},{"sent":"left zebra","ref_id":21957},{"sent":"right zebra","ref_id":21958},{"sent":"horse on left","ref_id":22022},{"sent":"horse in front","ref_id":22023},{"sent":"left keyboard","ref_id":22040},{"sent":"keyboard on the right","ref_id":22041},{"sent":"black computer right","ref_id":22042},{"sent":"right monitor","ref_id":22043},{"sent":"top left pizza","ref_id":22050},{"sent":"right slice","ref_id":22051},{"sent":"middle piece of food","ref_id":22163},{"sent":"bottom right corner","ref_id":22164},{"sent":"table in front","ref_id":22289},{"sent":"bed in front","ref_id":22290},{"sent":"orange on top of orange","ref_id":22324},{"sent":"orange bottom 
left","ref_id":22325},{"sent":"white car","ref_id":22382},{"sent":"white car in back","ref_id":22383},{"sent":"white car","ref_id":22384},{"sent":"bottom right orange","ref_id":22434},{"sent":"bottom right corner","ref_id":22435},{"sent":"bottom orange","ref_id":22436},{"sent":"top left apple","ref_id":22437},{"sent":"bottom left apple","ref_id":22438},{"sent":"right train","ref_id":22473},{"sent":"left train","ref_id":22474},{"sent":"couch on left","ref_id":22576},{"sent":"right couch","ref_id":22577},{"sent":"right giraffe","ref_id":22596},{"sent":"left giraffe","ref_id":22597},{"sent":"right zebra","ref_id":22630},{"sent":"black bag on right","ref_id":22656},{"sent":"left blue bag","ref_id":22657},{"sent":"right seat","ref_id":22658},{"sent":"oranges in front","ref_id":22723},{"sent":"chair on right","ref_id":22754},{"sent":"bed on right","ref_id":22755},{"sent":"chair bottom right","ref_id":22756},{"sent":"left sheep","ref_id":22773},{"sent":"right sheep","ref_id":22774},{"sent":"right edge of pic","ref_id":22859},{"sent":"the woman in the middle","ref_id":22860},{"sent":"woman in middle","ref_id":22861},{"sent":"left vase","ref_id":22933},{"sent":"vase","ref_id":22934},{"sent":"left vase","ref_id":22935},{"sent":"blue car on left","ref_id":22943},{"sent":"car on right","ref_id":22944},{"sent":"black cat","ref_id":22945},{"sent":"red book","ref_id":22946},{"sent":"left monitor","ref_id":22966},{"sent":"right monitor","ref_id":22967},{"sent":"left bench","ref_id":23040},{"sent":"right couch","ref_id":23041},{"sent":"orange","ref_id":23088},{"sent":"orange top right","ref_id":23089},{"sent":"orange","ref_id":23090},{"sent":"orange","ref_id":23091},{"sent":"top right corner","ref_id":23092},{"sent":"top left apples","ref_id":23151},{"sent":"middle row second from right","ref_id":23152},{"sent":"middle row second from right","ref_id":23153},{"sent":"broccoli on right","ref_id":23182},{"sent":"broccoli on the right","ref_id":23183},{"sent":"broccoli in 
middle","ref_id":23184},{"sent":"middle row second from bottom","ref_id":23185},{"sent":"left banana","ref_id":23297},{"sent":"left hot dog","ref_id":23298},{"sent":"the UNK","ref_id":23313},{"sent":"top of train","ref_id":23314},{"sent":"right giraffe","ref_id":23347},{"sent":"baby","ref_id":23362},{"sent":"baby","ref_id":23363},{"sent":"right zebra","ref_id":23469},{"sent":"right zebra","ref_id":23470},{"sent":"left zebra","ref_id":23471},{"sent":"giraffe on left","ref_id":23509},{"sent":"right giraffe","ref_id":23510},{"sent":"left cow","ref_id":23569},{"sent":"cow in middle","ref_id":23570},{"sent":"cow in middle","ref_id":23571},{"sent":"chair on left","ref_id":23583},{"sent":"bottom right corner","ref_id":23584},{"sent":"left hotdog","ref_id":23603},{"sent":"top right corner","ref_id":23604},{"sent":"cat on left","ref_id":23659},{"sent":"cat on left","ref_id":23660},{"sent":"cat on the left","ref_id":23661},{"sent":"elephant on left","ref_id":23721},{"sent":"elephant on right","ref_id":23722},{"sent":"left horse","ref_id":23797},{"sent":"horse on right","ref_id":23798},{"sent":"bottle in middle","ref_id":23810},{"sent":"bottle on right","ref_id":23811},{"sent":"bottle on the left","ref_id":23812},{"sent":"bottle on left","ref_id":23813},{"sent":"bottle on left","ref_id":23814},{"sent":"top right piece of broccoli","ref_id":23878},{"sent":"left piece of food","ref_id":23879},{"sent":"broccoli on left","ref_id":23880},{"sent":"right piece of food","ref_id":23881},{"sent":"right","ref_id":23882},{"sent":"red thing","ref_id":23883},{"sent":"suitcase on the right","ref_id":24098},{"sent":"suitcase on left","ref_id":24099},{"sent":"left bear","ref_id":24120},{"sent":"right bear","ref_id":24121},{"sent":"right bear","ref_id":24122},{"sent":"right bear","ref_id":24123},{"sent":"left bear","ref_id":24124},{"sent":"baby elephant","ref_id":24187},{"sent":"chair in front of man","ref_id":24192},{"sent":"laptop on left","ref_id":24193},{"sent":"laptop on 
right","ref_id":24194},{"sent":"UNK","ref_id":24224},{"sent":"top right book","ref_id":24225},{"sent":"right most UNK","ref_id":24274},{"sent":"second glass from right","ref_id":24275},{"sent":"second glass from left","ref_id":24276},{"sent":"umbrella on left","ref_id":24402},{"sent":"top right umbrella","ref_id":24403},{"sent":"green bus on left","ref_id":24448},{"sent":"bus in front","ref_id":24449},{"sent":"second from right","ref_id":24503},{"sent":"second from left","ref_id":24504},{"sent":"left train","ref_id":24505},{"sent":"second from left","ref_id":24506},{"sent":"cat on the right","ref_id":24523},{"sent":"cat on the right","ref_id":24524},{"sent":"bottom left corner","ref_id":24573},{"sent":"right bottom corner","ref_id":24574},{"sent":"right horse","ref_id":24588},{"sent":"left horse","ref_id":24589},{"sent":"right umbrella","ref_id":24604},{"sent":"top right umbrella","ref_id":24605},{"sent":"umbrella on left","ref_id":24606},{"sent":"cow on right","ref_id":24684},{"sent":"cow on left","ref_id":24685},{"sent":"cow on right","ref_id":24686},{"sent":"bottom carrot","ref_id":24687},{"sent":"top left piece of food","ref_id":24688},{"sent":"top donut","ref_id":24778},{"sent":"top left hot dog","ref_id":24779},{"sent":"bike on right","ref_id":24859},{"sent":"bike","ref_id":24860},{"sent":"white UNK","ref_id":24943},{"sent":"white vase","ref_id":24944},{"sent":"white UNK","ref_id":24945},{"sent":"the UNK","ref_id":24946},{"sent":"right UNK","ref_id":25053},{"sent":"right meter","ref_id":25054},{"sent":"middle meter","ref_id":25055},{"sent":"bowl on left","ref_id":25137},{"sent":"dog on left","ref_id":25151},{"sent":"red dog","ref_id":25152},{"sent":"left monitor","ref_id":25302},{"sent":"right monitor","ref_id":25303},{"sent":"right most vase","ref_id":25313},{"sent":"vase on left","ref_id":25314},{"sent":"white book","ref_id":25336},{"sent":"horse on right","ref_id":25342},{"sent":"green stuff","ref_id":25445},{"sent":"green 
stuff","ref_id":25446},{"sent":"right couch","ref_id":25504},{"sent":"couch","ref_id":25505},{"sent":"bear in front","ref_id":25659},{"sent":"bear","ref_id":25660},{"sent":"white bear","ref_id":25694},{"sent":"bear","ref_id":25695},{"sent":"red bus","ref_id":25717},{"sent":"left bus","ref_id":25718},{"sent":"red bus on right","ref_id":25719},{"sent":"toilet in front","ref_id":25762},{"sent":"sink on the right","ref_id":25763},{"sent":"apple on left","ref_id":25788},{"sent":"right slice","ref_id":25789},{"sent":"glass on left","ref_id":25826},{"sent":"glass on right","ref_id":25827},{"sent":"right elephant","ref_id":25831},{"sent":"elephant on right","ref_id":25832},{"sent":"top right microwave","ref_id":25888},{"sent":"left monitor","ref_id":25889},{"sent":"broccoli on left","ref_id":26005},{"sent":"broccoli on right","ref_id":26006},{"sent":"broccoli in middle","ref_id":26007},{"sent":"bottom oven","ref_id":26157},{"sent":"bottom oven","ref_id":26158},{"sent":"keyboard","ref_id":26159},{"sent":"white keyboard","ref_id":26160},{"sent":"bottom left corner","ref_id":26344},{"sent":"left chair","ref_id":26345},{"sent":"couch","ref_id":26346},{"sent":"couch","ref_id":26347},{"sent":"left banana","ref_id":26384},{"sent":"banana in the back","ref_id":26385},{"sent":"boat in front","ref_id":26447},{"sent":"left boat","ref_id":26448},{"sent":"boat in front","ref_id":26449},{"sent":"middle boat","ref_id":26450},{"sent":"white truck","ref_id":26513},{"sent":"white truck","ref_id":26514},{"sent":"white truck","ref_id":26515},{"sent":"left bike","ref_id":26528},{"sent":"front bike","ref_id":26529},{"sent":"red chair","ref_id":26601},{"sent":"red chair","ref_id":26602},{"sent":"bird on right","ref_id":26618},{"sent":"bird on right","ref_id":26619},{"sent":"bear in front","ref_id":26825},{"sent":"right bear","ref_id":26826},{"sent":"left bus","ref_id":26844},{"sent":"bus in front","ref_id":26845},{"sent":"red light","ref_id":27005},{"sent":"traffic 
light","ref_id":27006},{"sent":"traffic light","ref_id":27007},{"sent":"middle giraffe","ref_id":27130},{"sent":"left woman","ref_id":27131},{"sent":"right bear","ref_id":27214},{"sent":"left bear","ref_id":27215},{"sent":"right cat","ref_id":27232},{"sent":"cat on left","ref_id":27233},{"sent":"zebra in front","ref_id":27247},{"sent":"right zebra","ref_id":27248},{"sent":"left zebra","ref_id":27249},{"sent":"right car","ref_id":27250},{"sent":"white car","ref_id":27251},{"sent":"top left microwave","ref_id":27288},{"sent":"microwave on right","ref_id":27289},{"sent":"toilet on left","ref_id":27314},{"sent":"toilet","ref_id":27315},{"sent":"sheep on right","ref_id":27373},{"sent":"sheep in front","ref_id":27374},{"sent":"left sandwich","ref_id":27432},{"sent":"right sandwich","ref_id":27433},{"sent":"cat on right","ref_id":27465},{"sent":"cat","ref_id":27466},{"sent":"yellow toothbrush","ref_id":27526},{"sent":"bottom brush","ref_id":27527},{"sent":"top right pizza","ref_id":27572},{"sent":"pizza","ref_id":27573},{"sent":"bottom left carrot","ref_id":27751},{"sent":"sandwich on right","ref_id":27796},{"sent":"sandwich on the left","ref_id":27797},{"sent":"left slice","ref_id":27848},{"sent":"right side of pizza","ref_id":27849},{"sent":"bottom menu","ref_id":27880},{"sent":"top book","ref_id":27881},{"sent":"book on right","ref_id":27882},{"sent":"keyboard on right","ref_id":27883},{"sent":"book in middle","ref_id":27935},{"sent":"UNK book","ref_id":27936},{"sent":"bowl of food","ref_id":27973},{"sent":"giraffe in front","ref_id":28408},{"sent":"left giraffe","ref_id":28409},{"sent":"left flower vase","ref_id":28439},{"sent":"right plant","ref_id":28440},{"sent":"left vase","ref_id":28441},{"sent":"right vase","ref_id":28442},{"sent":"train","ref_id":28852},{"sent":"top right corner","ref_id":28853},{"sent":"top right chair","ref_id":28854},{"sent":"cat on the right","ref_id":29103},{"sent":"cat on right","ref_id":29104},{"sent":"bottom left 
oven","ref_id":29105},{"sent":"right sink","ref_id":29106},{"sent":"middle giraffe","ref_id":29153},{"sent":"giraffe in front","ref_id":29154},{"sent":"dog in front","ref_id":29238},{"sent":"dog on left","ref_id":29239},{"sent":"front plate","ref_id":29270},{"sent":"right slice","ref_id":29271},{"sent":"baby","ref_id":29301},{"sent":"elephant","ref_id":29302},{"sent":"bottom right dish","ref_id":29360},{"sent":"bottom plate","ref_id":29361},{"sent":"giraffe on right","ref_id":29385},{"sent":"giraffe on right","ref_id":29386},{"sent":"giraffe in front","ref_id":29387},{"sent":"left giraffe","ref_id":29388},{"sent":"left giraffe","ref_id":29389},{"sent":"left bear","ref_id":29460},{"sent":"white bear","ref_id":29461},{"sent":"big bear","ref_id":29462},{"sent":"bowl of soup","ref_id":29569},{"sent":"white cup","ref_id":29570},{"sent":"white car on right","ref_id":29575},{"sent":"truck","ref_id":29576},{"sent":"second bike from left","ref_id":29625},{"sent":"second bike from left","ref_id":29626},{"sent":"bike on right","ref_id":29627},{"sent":"left train","ref_id":29630},{"sent":"left plane","ref_id":29631},{"sent":"left bench","ref_id":29856},{"sent":"bench in front","ref_id":29857},{"sent":"top right umbrella","ref_id":29920},{"sent":"top left umbrella","ref_id":29921},{"sent":"guy in back","ref_id":29964},{"sent":"bus","ref_id":29967},{"sent":"white car","ref_id":29974},{"sent":"car on left","ref_id":29975},{"sent":"middle screen","ref_id":30281},{"sent":"left monitor","ref_id":30282},{"sent":"elephant on left","ref_id":30393},{"sent":"right elephant","ref_id":30394},{"sent":"bed on left","ref_id":30401},{"sent":"bottom bed","ref_id":30402},{"sent":"left bed","ref_id":30403},{"sent":"plant on left","ref_id":30480},{"sent":"vase","ref_id":30481},{"sent":"left pot","ref_id":30482},{"sent":"cow in back","ref_id":30529},{"sent":"cow","ref_id":30530},{"sent":"bottom left suitcase","ref_id":30631},{"sent":"black suitcase","ref_id":30632},{"sent":"right 
suitcase","ref_id":30633},{"sent":"black suitcase","ref_id":30634},{"sent":"black cat","ref_id":30699},{"sent":"cat on left","ref_id":30700},{"sent":"bear on left","ref_id":30701},{"sent":"bear","ref_id":30702},{"sent":"motorcycle on right","ref_id":30719},{"sent":"front bike","ref_id":30720},{"sent":"front left bike","ref_id":30721},{"sent":"right dish","ref_id":30813},{"sent":"left bowl","ref_id":30814},{"sent":"cat on left","ref_id":30839},{"sent":"cat on the left","ref_id":30840},{"sent":"truck","ref_id":30869},{"sent":"white truck","ref_id":30870},{"sent":"glass with red liquid","ref_id":30970},{"sent":"glass on right","ref_id":30971},{"sent":"right meter","ref_id":30996},{"sent":"left meter","ref_id":30997},{"sent":"right screen","ref_id":31025},{"sent":"left monitor","ref_id":31026},{"sent":"giraffe in back","ref_id":31114},{"sent":"right giraffe","ref_id":31115},{"sent":"left sheep","ref_id":31161},{"sent":"right sheep","ref_id":31162},{"sent":"sandwich on right","ref_id":31324},{"sent":"sandwich on left","ref_id":31325},{"sent":"cup on right","ref_id":31373},{"sent":"top right cup","ref_id":31374},{"sent":"bowl of food on left","ref_id":31375},{"sent":"cup of coffee","ref_id":31376},{"sent":"cup on right","ref_id":31377},{"sent":"bowl of UNK","ref_id":31378},{"sent":"horse on left","ref_id":31391},{"sent":"horse on right","ref_id":31392},{"sent":"sheep on right","ref_id":31393},{"sent":"sheep in back","ref_id":31394},{"sent":"sheep on right","ref_id":31395},{"sent":"bottom right corner","ref_id":31396},{"sent":"bottom left sheep","ref_id":31397},{"sent":"bus on right","ref_id":31558},{"sent":"bus in front","ref_id":31559},{"sent":"left bus","ref_id":31560},{"sent":"front vase","ref_id":31579},{"sent":"vase on left","ref_id":31580},{"sent":"right vase","ref_id":31581},{"sent":"bike on right","ref_id":31594},{"sent":"red bike","ref_id":31595},{"sent":"red bike","ref_id":31596},{"sent":"truck on 
right","ref_id":31619},{"sent":"truck","ref_id":31620},{"sent":"right sandwich","ref_id":31687},{"sent":"left sandwich","ref_id":31688},{"sent":"white thing on right","ref_id":31703},{"sent":"horse on left","ref_id":31706},{"sent":"giraffe in middle","ref_id":31729},{"sent":"giraffe in front","ref_id":31730},{"sent":"right sheep","ref_id":31736},{"sent":"left sheep","ref_id":31737},{"sent":"right animal","ref_id":31758},{"sent":"left sheep","ref_id":31759},{"sent":"meter on the right","ref_id":31778},{"sent":"right meter","ref_id":31779},{"sent":"sheep in front","ref_id":31897},{"sent":"sheep in front","ref_id":31898},{"sent":"right sheep","ref_id":31899},{"sent":"left donut","ref_id":31960},{"sent":"right donut","ref_id":31961},{"sent":"umbrella on left","ref_id":31981},{"sent":"umbrella","ref_id":31982},{"sent":"elephant on left","ref_id":32094},{"sent":"elephant on right","ref_id":32095},{"sent":"right sandwich","ref_id":32165},{"sent":"left hot dog","ref_id":32166},{"sent":"slice of pizza","ref_id":32214},{"sent":"top oven","ref_id":32265},{"sent":"stove","ref_id":32266},{"sent":"motorcycle in front","ref_id":32311},{"sent":"front bike","ref_id":32312},{"sent":"left sandwich","ref_id":32362},{"sent":"bottom left food","ref_id":32363},{"sent":"right sandwich","ref_id":32364},{"sent":"bottom left bread","ref_id":32365},{"sent":"hand","ref_id":32370},{"sent":"hand","ref_id":32371},{"sent":"right screen","ref_id":32572},{"sent":"left monitor","ref_id":32573},{"sent":"dog on right","ref_id":32642},{"sent":"dog on left","ref_id":32643},{"sent":"zebra in back","ref_id":32928},{"sent":"zebra in front","ref_id":32929},{"sent":"left glass","ref_id":32956},{"sent":"person in back","ref_id":32957},{"sent":"pizza slice on top","ref_id":33014},{"sent":"pizza slice on right","ref_id":33015},{"sent":"pizza slice on right","ref_id":33016},{"sent":"bottom left pizza","ref_id":33017},{"sent":"top pizza","ref_id":33018},{"sent":"left UNK","ref_id":33237},{"sent":"bottom right 
corner","ref_id":33238},{"sent":"the little UNK","ref_id":33239},{"sent":"red vase","ref_id":33240},{"sent":"the little UNK","ref_id":33241},{"sent":"left vase","ref_id":33242},{"sent":"cat on right","ref_id":33291},{"sent":"left cat","ref_id":33292},{"sent":"zebra in front","ref_id":33439},{"sent":"right light","ref_id":33455},{"sent":"pizza on right","ref_id":33470},{"sent":"bottom left slice","ref_id":33471},{"sent":"right elephant","ref_id":33500},{"sent":"elephant on left","ref_id":33501},{"sent":"bottom donut","ref_id":33626},{"sent":"donut on right","ref_id":33627},{"sent":"donut on left","ref_id":33628},{"sent":"zebra on left","ref_id":33639},{"sent":"right train","ref_id":33681},{"sent":"left train","ref_id":33682},{"sent":"right bus","ref_id":33683},{"sent":"chair on right","ref_id":33684},{"sent":"top right dog","ref_id":33685},{"sent":"cat on left","ref_id":33686},{"sent":"red bike","ref_id":33714},{"sent":"front bike","ref_id":33715},{"sent":"bottom left cup","ref_id":33800},{"sent":"cup","ref_id":33801},{"sent":"elephant in front","ref_id":33806},{"sent":"elephant on right","ref_id":33807},{"sent":"bottom left bowl","ref_id":33829},{"sent":"bottom left cup","ref_id":33830},{"sent":"bowl of rice in back right","ref_id":33831},{"sent":"broccoli on left","ref_id":33914},{"sent":"bottom left broccoli","ref_id":33915},{"sent":"middle donut","ref_id":33952},{"sent":"middle donut","ref_id":33953},{"sent":"second from right","ref_id":33992},{"sent":"second from right","ref_id":33993},{"sent":"second from right","ref_id":33994},{"sent":"right carrot","ref_id":33995},{"sent":"right side of food","ref_id":33996},{"sent":"right most carrot","ref_id":33997},{"sent":"UNK","ref_id":34321},{"sent":"glass of water","ref_id":34322},{"sent":"elephant on right","ref_id":34631},{"sent":"elephant in front","ref_id":34632},{"sent":"top left food","ref_id":34787},{"sent":"top left food","ref_id":34788},{"sent":"left umbrella","ref_id":34858},{"sent":"left 
umbrella","ref_id":34859},{"sent":"bottom right bowl","ref_id":34895},{"sent":"car in front of the cart","ref_id":34943},{"sent":"car on left","ref_id":34944},{"sent":"pizza on right","ref_id":34998},{"sent":"pizza slice on left","ref_id":34999},{"sent":"UNK","ref_id":35034},{"sent":"left monitor","ref_id":35035},{"sent":"left keyboard","ref_id":35090},{"sent":"laptop on left","ref_id":35091},{"sent":"top left laptop","ref_id":35121},{"sent":"top left chair","ref_id":35122},{"sent":"chair on right","ref_id":35148},{"sent":"couch on right","ref_id":35149},{"sent":"broccoli on right","ref_id":35188},{"sent":"broccoli on the right","ref_id":35189},{"sent":"broccoli on left","ref_id":35190},{"sent":"broccoli in middle","ref_id":35191},{"sent":"middle bird","ref_id":35194},{"sent":"left cat","ref_id":35195},{"sent":"bird on left","ref_id":35217},{"sent":"bird on the right","ref_id":35218},{"sent":"black bag on left","ref_id":35368},{"sent":"black bag on top of suitcase","ref_id":35369},{"sent":"right suitcase","ref_id":35370},{"sent":"right suitcase","ref_id":35371},{"sent":"black suitcase on top of suitcase","ref_id":35372},{"sent":"top right orange","ref_id":35377},{"sent":"middle row second from right","ref_id":35378},{"sent":"second from left","ref_id":35379},{"sent":"stove top right","ref_id":35391},{"sent":"oven","ref_id":35392},{"sent":"left girl","ref_id":35420},{"sent":"chair on right","ref_id":35421},{"sent":"top right apple","ref_id":35522},{"sent":"orange","ref_id":35523},{"sent":"orange bottom right","ref_id":35524},{"sent":"bottom right apple","ref_id":35525},{"sent":"top apple","ref_id":35526},{"sent":"right suitcase","ref_id":35764},{"sent":"right bag","ref_id":35765},{"sent":"left couch","ref_id":35833},{"sent":"right couch","ref_id":35834},{"sent":"right zebra","ref_id":35913},{"sent":"left zebra","ref_id":35914},{"sent":"green plant","ref_id":36001},{"sent":"bottom right corner","ref_id":36002},{"sent":"cat on 
right","ref_id":36082},{"sent":"cat","ref_id":36083},{"sent":"white truck","ref_id":36111},{"sent":"white truck","ref_id":36112},{"sent":"right zebra","ref_id":36209},{"sent":"left zebra","ref_id":36210},{"sent":"bottom right food","ref_id":36224},{"sent":"right pizza","ref_id":36225},{"sent":"top right banana","ref_id":36279},{"sent":"bottom left apple","ref_id":36280},{"sent":"banana in middle","ref_id":36281},{"sent":"UNK","ref_id":36365},{"sent":"top left umbrella","ref_id":36366},{"sent":"bottom right microwave","ref_id":36432},{"sent":"bottom right corner","ref_id":36433},{"sent":"left monitor","ref_id":36434},{"sent":"top right microwave","ref_id":36435},{"sent":"left microwave","ref_id":36436},{"sent":"sandwich","ref_id":36689},{"sent":"top sandwich","ref_id":36690},{"sent":"bottom left orange","ref_id":36725},{"sent":"orange peel","ref_id":36726},{"sent":"right glass","ref_id":36762},{"sent":"right glass","ref_id":36763},{"sent":"glass on left","ref_id":36764},{"sent":"wine bottle on right","ref_id":36789},{"sent":"left glass","ref_id":36790},{"sent":"right giraffe","ref_id":36894},{"sent":"left giraffe","ref_id":36895},{"sent":"red sweater","ref_id":36900},{"sent":"bear on right","ref_id":36901},{"sent":"green and white UNK","ref_id":36902},{"sent":"bear on right","ref_id":36903},{"sent":"right animal","ref_id":37150},{"sent":"right animal","ref_id":37151},{"sent":"chair in middle","ref_id":37201},{"sent":"right chair","ref_id":37202},{"sent":"left chair","ref_id":37203},{"sent":"white plane","ref_id":37213},{"sent":"plane in front","ref_id":37214},{"sent":"bottom left corner","ref_id":37252},{"sent":"vase on right","ref_id":37253},{"sent":"right vase","ref_id":37254},{"sent":"UNK","ref_id":37255},{"sent":"bananas on left","ref_id":37278},{"sent":"right bunch","ref_id":37279},{"sent":"slice of pizza in front","ref_id":37540},{"sent":"pizza","ref_id":37541},{"sent":"white umbrella","ref_id":37572},{"sent":"bottom right corner","ref_id":37573},{"sent":"left 
bowl","ref_id":37650},{"sent":"broccoli","ref_id":37651},{"sent":"top right bowl","ref_id":37652},{"sent":"bowl of UNK","ref_id":37661},{"sent":"bowl of UNK","ref_id":37662},{"sent":"bowl of rice","ref_id":37663},{"sent":"black suitcase on left","ref_id":37710},{"sent":"suitcase on left","ref_id":37711},{"sent":"right suitcase","ref_id":37712},{"sent":"zebra on left","ref_id":37749},{"sent":"zebra in back","ref_id":37750},{"sent":"bottom left corner","ref_id":37802},{"sent":"middle cake","ref_id":37803},{"sent":"bed on right","ref_id":37879},{"sent":"couch on left","ref_id":37880},{"sent":"couch on left","ref_id":37895},{"sent":"couch","ref_id":37896},{"sent":"bottom right corner","ref_id":37933},{"sent":"cow in front","ref_id":37963},{"sent":"black cow","ref_id":37964},{"sent":"bottom phone","ref_id":38029},{"sent":"right phone","ref_id":38030},{"sent":"banana on top","ref_id":38151},{"sent":"top left sandwich","ref_id":38266},{"sent":"sandwich","ref_id":38267},{"sent":"food in front","ref_id":38268},{"sent":"second from right","ref_id":38333},{"sent":"second board from right","ref_id":38334},{"sent":"left board","ref_id":38335},{"sent":"right","ref_id":38368},{"sent":"bottom left","ref_id":38369},{"sent":"top left dish","ref_id":38370},{"sent":"carrots","ref_id":38371},{"sent":"right bear","ref_id":38388},{"sent":"left bear","ref_id":38389},{"sent":"top dog","ref_id":38567},{"sent":"dog","ref_id":38568},{"sent":"clock face","ref_id":38601},{"sent":"clock on left","ref_id":38602},{"sent":"clock on right","ref_id":38603},{"sent":"white plane","ref_id":38647},{"sent":"bed on right","ref_id":38648},{"sent":"bed","ref_id":38649},{"sent":"top right corner","ref_id":38720},{"sent":"right bottom corner","ref_id":38721},{"sent":"top dog","ref_id":38779},{"sent":"left dog","ref_id":38780},{"sent":"top right dog","ref_id":38781},{"sent":"right glass","ref_id":38783},{"sent":"chair on right","ref_id":38903},{"sent":"red shirt","ref_id":38919},{"sent":"top right 
corner","ref_id":38920},{"sent":"right","ref_id":39016},{"sent":"left vase","ref_id":39017},{"sent":"right bottle","ref_id":39018},{"sent":"left elephant","ref_id":39024},{"sent":"elephant on right","ref_id":39025},{"sent":"left elephant","ref_id":39026},{"sent":"giraffe in middle","ref_id":39045},{"sent":"left train","ref_id":39150},{"sent":"front train","ref_id":39151},{"sent":"cat on the left","ref_id":39159},{"sent":"top cat","ref_id":39160},{"sent":"left suitcase","ref_id":39206},{"sent":"right","ref_id":39207},{"sent":"giraffe in front","ref_id":39387},{"sent":"giraffe in front","ref_id":39388},{"sent":"white teddy bear on right","ref_id":39400},{"sent":"teddy bear in middle","ref_id":39401},{"sent":"bear on left","ref_id":39402},{"sent":"bottom right bear","ref_id":39403},{"sent":"bear on left","ref_id":39404},{"sent":"bear on right","ref_id":39405},{"sent":"left elephant","ref_id":39448},{"sent":"elephant in front","ref_id":39449},{"sent":"cow on left","ref_id":39456},{"sent":"cow on right","ref_id":39457},{"sent":"right sandwich","ref_id":39460},{"sent":"sandwich on left","ref_id":39461},{"sent":"left chair","ref_id":39765},{"sent":"chair on left","ref_id":39766},{"sent":"left cow","ref_id":39797},{"sent":"cow on right","ref_id":39798},{"sent":"green apple","ref_id":39815},{"sent":"green apple","ref_id":39816},{"sent":"green apple","ref_id":39817},{"sent":"right zebra","ref_id":39850},{"sent":"zebra in front","ref_id":39851},{"sent":"right pizza","ref_id":39883},{"sent":"pizza in front","ref_id":39884},{"sent":"pizza on left","ref_id":39885},{"sent":"right laptop","ref_id":39891},{"sent":"left laptop","ref_id":39892},{"sent":"left truck","ref_id":39963},{"sent":"red truck","ref_id":39964},{"sent":"top left orange","ref_id":39965},{"sent":"orange","ref_id":39966},{"sent":"cat on left","ref_id":40063},{"sent":"cat on right","ref_id":40064},{"sent":"bed","ref_id":40188},{"sent":"bed on left","ref_id":40189},{"sent":"top right 
microwave","ref_id":40211},{"sent":"middle UNK","ref_id":40212},{"sent":"right phone","ref_id":40213},{"sent":"left cake","ref_id":40235},{"sent":"left cake","ref_id":40236},{"sent":"right cake","ref_id":40237},{"sent":"bottom sandwich","ref_id":40238},{"sent":"left most seat","ref_id":40248},{"sent":"left suitcase","ref_id":40249},{"sent":"bottom right bowl","ref_id":40287},{"sent":"top right bowl","ref_id":40288},{"sent":"table","ref_id":40350},{"sent":"table behind the table","ref_id":40351},{"sent":"left person","ref_id":40358},{"sent":"cat on right","ref_id":40400},{"sent":"left cat","ref_id":40401},{"sent":"the plant","ref_id":40456},{"sent":"left bunch","ref_id":40457},{"sent":"right dog","ref_id":40458},{"sent":"dog on left","ref_id":40459},{"sent":"left bike","ref_id":40479},{"sent":"front bike","ref_id":40480},{"sent":"bike on right","ref_id":40500},{"sent":"bike in front","ref_id":40501},{"sent":"bike on left","ref_id":40502},{"sent":"top bowl","ref_id":40554},{"sent":"right sandwich","ref_id":40555},{"sent":"cow in front","ref_id":40571},{"sent":"cow on right","ref_id":40572},{"sent":"left meter","ref_id":40753},{"sent":"left car","ref_id":40754},{"sent":"bottom left orange","ref_id":40762},{"sent":"orange","ref_id":40763},{"sent":"left giraffe","ref_id":40804},{"sent":"giraffe on right","ref_id":40805},{"sent":"bottom left fruit","ref_id":40810},{"sent":"banana slice in the middle","ref_id":40811},{"sent":"banana on right","ref_id":40812},{"sent":"toilet","ref_id":40909},{"sent":"toilet","ref_id":40910},{"sent":"second from right","ref_id":40945},{"sent":"second row from right","ref_id":40946},{"sent":"second from right","ref_id":40947},{"sent":"second banana from left","ref_id":40948},{"sent":"second row from left","ref_id":40949},{"sent":"left elephant","ref_id":41094},{"sent":"baby elephant","ref_id":41095},{"sent":"right cow","ref_id":41136},{"sent":"giraffe in front","ref_id":41137},{"sent":"chair on 
right","ref_id":41148},{"sent":"bed","ref_id":41167},{"sent":"right bed","ref_id":41168},{"sent":"left screen","ref_id":41173},{"sent":"right monitor","ref_id":41174},{"sent":"left white chair","ref_id":41197},{"sent":"chair on right","ref_id":41198},{"sent":"UNK","ref_id":41209},{"sent":"blue UNK","ref_id":41351},{"sent":"right giraffe","ref_id":41359},{"sent":"giraffe on left","ref_id":41360},{"sent":"right bed","ref_id":41531},{"sent":"red bed","ref_id":41532},{"sent":"left giraffe","ref_id":41551},{"sent":"right giraffe","ref_id":41552},{"sent":"sheep in back","ref_id":41743},{"sent":"sheep in front","ref_id":41744},{"sent":"broccoli on left","ref_id":41795},{"sent":"broccoli on right","ref_id":41796},{"sent":"broccoli on right","ref_id":41797},{"sent":"broccoli on left","ref_id":41798},{"sent":"broccoli on left","ref_id":41799},{"sent":"bottom right food","ref_id":41805},{"sent":"left bowl","ref_id":41806},{"sent":"white bowl on right","ref_id":41807},{"sent":"right most food","ref_id":41808},{"sent":"bowl of food on left","ref_id":41809},{"sent":"sheep in back","ref_id":41877},{"sent":"right sheep","ref_id":41878},{"sent":"cat on right","ref_id":41938},{"sent":"cat on left","ref_id":41939},{"sent":"left horse","ref_id":42136},{"sent":"left horse","ref_id":42137},{"sent":"bird","ref_id":42284},{"sent":"bird","ref_id":42285},{"sent":"left bird","ref_id":42296},{"sent":"duck","ref_id":42297},{"sent":"red surfboard","ref_id":42329},{"sent":"white boat","ref_id":42330},{"sent":"bottom left toilet","ref_id":42354},{"sent":"white toilet","ref_id":42355},{"sent":"toilet on left","ref_id":42356},{"sent":"toilet on right","ref_id":42357},{"sent":"toilet on left","ref_id":42358},{"sent":"toilet on left","ref_id":42359},{"sent":"bananas on left","ref_id":42428},{"sent":"banana bunch","ref_id":42429},{"sent":"bowl of food on right","ref_id":42658},{"sent":"bowl of food","ref_id":42659},{"sent":"bowl of food","ref_id":42660},{"sent":"left 
donut","ref_id":42697},{"sent":"bottom right donut","ref_id":42698},{"sent":"train","ref_id":42839},{"sent":"left bird","ref_id":42922},{"sent":"bird on right","ref_id":42923},{"sent":"zebra on left","ref_id":42928},{"sent":"zebra on right","ref_id":42929},{"sent":"zebra in back","ref_id":42930},{"sent":"sandwich on top","ref_id":43160},{"sent":"top left sandwich","ref_id":43161},{"sent":"cake","ref_id":43210},{"sent":"bottom left suitcase","ref_id":43211},{"sent":"bird on the left","ref_id":43396},{"sent":"plate of food","ref_id":43446},{"sent":"top bowl","ref_id":43447},{"sent":"plate with lettuce","ref_id":43448},{"sent":"sandwich on right","ref_id":43449},{"sent":"right sandwich","ref_id":43480},{"sent":"sandwich on left","ref_id":43481},{"sent":"toilet","ref_id":43581},{"sent":"toilet","ref_id":43582},{"sent":"cat on right","ref_id":43596},{"sent":"black chair bottom right","ref_id":43597},{"sent":"giraffe on right","ref_id":43598},{"sent":"left giraffe","ref_id":43599},{"sent":"right sandwich","ref_id":43700},{"sent":"sandwich on left","ref_id":43701},{"sent":"red truck","ref_id":43784},{"sent":"truck","ref_id":43785},{"sent":"banana on right","ref_id":43815},{"sent":"right banana","ref_id":43816},{"sent":"bottom right corner","ref_id":43817},{"sent":"banana on right","ref_id":43818},{"sent":"middle banana","ref_id":43819},{"sent":"cow on left","ref_id":43940},{"sent":"top cow","ref_id":43941},{"sent":"black suitcase on right","ref_id":43987},{"sent":"black suitcase on right","ref_id":43988},{"sent":"red suitcase","ref_id":43989},{"sent":"black suitcase in middle","ref_id":43990},{"sent":"baby","ref_id":44146},{"sent":"teddy bear","ref_id":44147},{"sent":"bottom left food","ref_id":44160},{"sent":"right plate","ref_id":44161},{"sent":"bottom tray","ref_id":44162},{"sent":"top plate","ref_id":44163},{"sent":"right bed","ref_id":44228},{"sent":"bed on left","ref_id":44229},{"sent":"bottom left bowl","ref_id":44296},{"sent":"bottom 
banana","ref_id":44297},{"sent":"front giraffe","ref_id":44300},{"sent":"giraffe on right","ref_id":44301},{"sent":"plane in front","ref_id":44369},{"sent":"plane on left","ref_id":44370},{"sent":"plane in front","ref_id":44408},{"sent":"plane in front","ref_id":44409},{"sent":"top right cake","ref_id":44426},{"sent":"left half of sandwich","ref_id":44427},{"sent":"left sandwich","ref_id":44428},{"sent":"right half of sandwich","ref_id":44429},{"sent":"bottom left bread","ref_id":44430},{"sent":"bottom left cake","ref_id":44431},{"sent":"right half of sandwich","ref_id":44432},{"sent":"white toilet","ref_id":44454},{"sent":"left sheep","ref_id":44463},{"sent":"sheep in front","ref_id":44464},{"sent":"sandwich on right","ref_id":44467},{"sent":"left sandwich","ref_id":44468},{"sent":"right car","ref_id":44514},{"sent":"car on left","ref_id":44515},{"sent":"yellow cab","ref_id":44516},{"sent":"right toilet","ref_id":44573},{"sent":"toilet on left","ref_id":44574},{"sent":"bottom right bear","ref_id":44661},{"sent":"bear on left","ref_id":44662},{"sent":"bear on right","ref_id":44663},{"sent":"right meter","ref_id":44814},{"sent":"left meter","ref_id":44815},{"sent":"red sauce","ref_id":44856},{"sent":"glass on left","ref_id":44857},{"sent":"bear on right","ref_id":44884},{"sent":"brown bear","ref_id":44885},{"sent":"bear on left","ref_id":44886},{"sent":"bear on right","ref_id":44887},{"sent":"teddy bear on left","ref_id":44888},{"sent":"left meter","ref_id":44941},{"sent":"right meter","ref_id":44942},{"sent":"UNK","ref_id":44989},{"sent":"top right corner","ref_id":44990},{"sent":"bed","ref_id":45056},{"sent":"bottom left suitcase","ref_id":45057},{"sent":"blue umbrella","ref_id":45127},{"sent":"left blue vase","ref_id":45128},{"sent":"blue umbrella","ref_id":45129},{"sent":"left blue umbrella","ref_id":45130},{"sent":"bottom right corner","ref_id":45131},{"sent":"glass of the beer","ref_id":45256},{"sent":"right side of table","ref_id":45257},{"sent":"white 
bus","ref_id":45498},{"sent":"bus in middle","ref_id":45499},{"sent":"right bunch of bananas","ref_id":45503},{"sent":"bananas on right","ref_id":45504},{"sent":"bananas on the left","ref_id":45505},{"sent":"elephant on right","ref_id":45533},{"sent":"elephant on left","ref_id":45534},{"sent":"elephant in front","ref_id":45535},{"sent":"left zebra","ref_id":45645},{"sent":"right zebra","ref_id":45646},{"sent":"middle row","ref_id":45680},{"sent":"bottom right cake","ref_id":45681},{"sent":"white car in back","ref_id":45957},{"sent":"train","ref_id":45958},{"sent":"pizza","ref_id":45972},{"sent":"pizza slice","ref_id":45973},{"sent":"table in front","ref_id":45974},{"sent":"top right corner","ref_id":45975},{"sent":"sandwich on left","ref_id":45976},{"sent":"right half of sandwich","ref_id":45977},{"sent":"blue thing","ref_id":45984},{"sent":"blue thing","ref_id":45985},{"sent":"chair behind dog","ref_id":46007},{"sent":"chair on right","ref_id":46008},{"sent":"donut on right","ref_id":46423},{"sent":"bottom row second from left","ref_id":46439},{"sent":"bottom row second from left","ref_id":46440},{"sent":"second row from bottom right","ref_id":46441},{"sent":"right giraffe","ref_id":46476},{"sent":"giraffe on left","ref_id":46477},{"sent":"truck on right","ref_id":46501},{"sent":"white truck","ref_id":46502},{"sent":"white truck","ref_id":46503},{"sent":"right elephant","ref_id":46569},{"sent":"middle elephant","ref_id":46570},{"sent":"right zebra","ref_id":46668},{"sent":"zebra in front","ref_id":46669},{"sent":"top pizza","ref_id":46684},{"sent":"pizza","ref_id":46685},{"sent":"car in front","ref_id":46724},{"sent":"car in front","ref_id":46725},{"sent":"sheep in front","ref_id":46744},{"sent":"bottom right bowl","ref_id":46773},{"sent":"bowl of food on left","ref_id":46774},{"sent":"middle bowl","ref_id":46775},{"sent":"left pizza","ref_id":46796},{"sent":"pizza on right","ref_id":46797},{"sent":"pizza on the right","ref_id":46798},{"sent":"bear on 
left","ref_id":46817},{"sent":"right bear","ref_id":46818},{"sent":"right plant","ref_id":46965},{"sent":"left UNK","ref_id":46966},{"sent":"left suitcase","ref_id":46986},{"sent":"right suitcase","ref_id":46987},{"sent":"right suitcase","ref_id":46988},{"sent":"right suitcase","ref_id":46989},{"sent":"suitcase in middle","ref_id":46990},{"sent":"top carrot","ref_id":47273},{"sent":"top carrot","ref_id":47274},{"sent":"top carrot","ref_id":47275},{"sent":"top carrot","ref_id":47276},{"sent":"bottom left carrot","ref_id":47277},{"sent":"right giraffe","ref_id":47305},{"sent":"left giraffe","ref_id":47306},{"sent":"bowl of food","ref_id":47310},{"sent":"bowl of rice","ref_id":47311},{"sent":"top right bowl","ref_id":47312},{"sent":"train on right","ref_id":47313},{"sent":"train on right","ref_id":47314},{"sent":"train on left","ref_id":47315},{"sent":"chair on left","ref_id":47318},{"sent":"right giraffe","ref_id":47366},{"sent":"left zebra","ref_id":47367},{"sent":"white cow","ref_id":47450},{"sent":"big cow","ref_id":47451},{"sent":"white sheep on right","ref_id":47452},{"sent":"white sheep on right","ref_id":47453},{"sent":"top right sheep","ref_id":47454},{"sent":"white cow","ref_id":47455},{"sent":"top left suitcase","ref_id":47529},{"sent":"white boat on left","ref_id":47530},{"sent":"white boat","ref_id":47531},{"sent":"left giraffe","ref_id":47603},{"sent":"right giraffe","ref_id":47604},{"sent":"white plate","ref_id":47644},{"sent":"top left food","ref_id":47645},{"sent":"bear on right","ref_id":47740},{"sent":"bear on right","ref_id":47741},{"sent":"bear in middle","ref_id":47742},{"sent":"bear on right","ref_id":47743},{"sent":"bed on left","ref_id":47840},{"sent":"bed","ref_id":47841},{"sent":"black dog","ref_id":47875},{"sent":"dog on right","ref_id":47876},{"sent":"giraffe in front","ref_id":47931},{"sent":"giraffe on left","ref_id":47932},{"sent":"left person","ref_id":47957},{"sent":"giraffe on 
left","ref_id":48055},{"sent":"giraffe","ref_id":48056},{"sent":"bowl of food","ref_id":48175},{"sent":"right bowl","ref_id":48176},{"sent":"zebra in front","ref_id":48302},{"sent":"right zebra","ref_id":48303},{"sent":"bottom left food","ref_id":48441},{"sent":"left piece of food","ref_id":48442},{"sent":"top right food","ref_id":48443},{"sent":"right food","ref_id":48444},{"sent":"left vase","ref_id":48545},{"sent":"right vase","ref_id":48546},{"sent":"second bike from right","ref_id":48585},{"sent":"second from right","ref_id":48586},{"sent":"second from left","ref_id":48587},{"sent":"left bike","ref_id":48588},{"sent":"second bike from right","ref_id":48589},{"sent":"right half of sandwich","ref_id":48623},{"sent":"left sandwich","ref_id":48624},{"sent":"right sandwich","ref_id":48625},{"sent":"bottom right chair","ref_id":48681},{"sent":"right couch","ref_id":48682},{"sent":"bottom left corner","ref_id":48683},{"sent":"couch on right","ref_id":48684},{"sent":"couch","ref_id":48685},{"sent":"left couch","ref_id":48686},{"sent":"left cake","ref_id":48861},{"sent":"right cake","ref_id":48862},{"sent":"left elephant","ref_id":48865},{"sent":"baby elephant","ref_id":48866},{"sent":"right elephant","ref_id":48888},{"sent":"left elephant","ref_id":48889},{"sent":"table in front","ref_id":49111},{"sent":"right truck","ref_id":49165},{"sent":"truck","ref_id":49166},{"sent":"truck","ref_id":49167},{"sent":"right truck","ref_id":49168},{"sent":"right bike","ref_id":49248},{"sent":"right bike","ref_id":49249},{"sent":"right bike","ref_id":49250},{"sent":"donut in middle","ref_id":49288},{"sent":"donut on left","ref_id":49289},{"sent":"donut in middle","ref_id":49290},{"sent":"giraffe on left","ref_id":49291},{"sent":"left giraffe","ref_id":49292},{"sent":"giraffe in front","ref_id":49293},{"sent":"left cup","ref_id":49377},{"sent":"right cup","ref_id":49378},{"sent":"left bottle","ref_id":49429},{"sent":"bottle on 
right","ref_id":49430},{"sent":"pizza","ref_id":49446},{"sent":"pizza slice on right","ref_id":49447},{"sent":"pizza slice on right","ref_id":49448},{"sent":"broccoli on the right","ref_id":49455},{"sent":"broccoli in the middle","ref_id":49456},{"sent":"second board from right","ref_id":49502},{"sent":"blue board","ref_id":49503},{"sent":"right plant","ref_id":49583},{"sent":"left plant","ref_id":49584},{"sent":"black suitcase","ref_id":49672},{"sent":"blue tie","ref_id":49673},{"sent":"middle bus","ref_id":49701},{"sent":"second bus from right","ref_id":49702},{"sent":"right bus","ref_id":49703},{"sent":"right suitcase","ref_id":49721},{"sent":"right side of pizza","ref_id":49781},{"sent":"right slice","ref_id":49782},{"sent":"dog on right","ref_id":49818},{"sent":"left dog","ref_id":49819},{"sent":"left car","ref_id":49824},{"sent":"white car","ref_id":49825},{"sent":"right cup","ref_id":49949},{"sent":"right cup","ref_id":49950},{"sent":"zebra in back","ref_id":49986},{"sent":"zebra on left","ref_id":49987},{"sent":"car on left","ref_id":25},{"sent":"car on left","ref_id":26},{"sent":"top sandwich","ref_id":27},{"sent":"top left donut","ref_id":28},{"sent":"zebra on left","ref_id":45},{"sent":"right zebra","ref_id":46},{"sent":"chair in front of man","ref_id":164},{"sent":"bottom right corner","ref_id":165},{"sent":"left chair","ref_id":166},{"sent":"top right corner","ref_id":232},{"sent":"pizza in front","ref_id":233},{"sent":"glass in back","ref_id":234},{"sent":"left glass","ref_id":235},{"sent":"yellow fruit on left","ref_id":259}]} \ No newline at end of file diff --git a/train_baseline.py b/train_baseline.py new file mode 100644 index 0000000..54541ae --- /dev/null +++ b/train_baseline.py @@ -0,0 +1,622 @@ +# coding=utf-8 +# Copyright 2018 The Google AI Language Team Authors and The HuggingFace Inc. team. +# Copyright (c) 2018, NVIDIA CORPORATION. All rights reserved. 
+# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. +"""BERT finetuning runner.""" + +import argparse +import json +import logging +import os +import random +from io import open +import math +import sys + +from time import gmtime, strftime +from timeit import default_timer as timer + +import numpy as np +from tensorboardX import SummaryWriter +from tqdm import tqdm, trange + +import torch +from torch.utils.data import DataLoader, Dataset, RandomSampler +from torch.utils.data.distributed import DistributedSampler + +from pytorch_pretrained_bert.tokenization import BertTokenizer +from pytorch_pretrained_bert.optimization import BertAdam, WarmupLinearSchedule +from pytorch_pretrained_bert import BertModel + +from vilbert.datasets import ConceptCapLoaderTrain, ConceptCapLoaderVal +from vilbert.basebert import BertForMultiModalPreTraining +from pytorch_pretrained_bert.modeling import BertConfig +import pdb + +logging.basicConfig( + format="%(asctime)s - %(levelname)s - %(name)s - %(message)s", + datefmt="%m/%d/%Y %H:%M:%S", + level=logging.INFO, +) +logger = logging.getLogger(__name__) + + +def main(): + parser = argparse.ArgumentParser() + + # Required parameters + parser.add_argument( + "--train_file", + default="data/conceptual_caption/training", + type=str, + # required=True, + help="The input train corpus.", + ) + parser.add_argument( + "--validation_file", + default="data/conceptual_caption/validation", + type=str, + # required=True, + help="The input train corpus.", + ) 
+ parser.add_argument( + "--pretrained_weight", + default="bert-base-uncased", + type=str, + help="Bert pre-trained model selected in the list: bert-base-uncased, " + "bert-large-uncased, bert-base-cased, bert-base-multilingual, bert-base-chinese.", + ) + + parser.add_argument( + "--bert_model", + default="bert-base-uncased", + type=str, + help="Bert pre-trained model selected in the list: bert-base-uncased, " + "bert-large-uncased, bert-base-cased, bert-base-multilingual, bert-base-chinese.", + ) + parser.add_argument( + "--output_dir", + default="save", + type=str, + # required=True, + help="The output directory where the model checkpoints will be written.", + ) + + parser.add_argument( + "--config_file", + default="config/bert_config.json", + type=str, + # required=True, + help="The config file that specifies the model details.", + ) + ## Other parameters + parser.add_argument( + "--max_seq_length", + default=36, + type=int, + help="The maximum total input sequence length after WordPiece tokenization. \n" + "Sequences longer than this will be truncated, and sequences shorter \n" + "than this will be padded.", + ) + parser.add_argument("--predict_feature", action="store_true", help="visual target.") + parser.add_argument( + "--use_location", action="store_true", help="whether to use location features." + ) + parser.add_argument( + "--do_train", action="store_true", help="Whether to run training." + ) + parser.add_argument( + "--train_batch_size", + default=512, + type=int, + help="Total batch size for training.", + ) + parser.add_argument( + "--learning_rate", + default=1e-4, + type=float, + help="The initial learning rate for Adam.", + ) + parser.add_argument( + "--num_train_epochs", + default=10.0, + type=float, + help="Total number of training epochs to perform.", + ) + parser.add_argument( + "--warmup_proportion", + default=0.1, + type=float, + help="Proportion of training to perform linear learning rate warmup for. " + "E.g., 0.1 = 10%% of training.", + ) + parser.add_argument( + "--img_weight", default=1, type=float, help="weight for the image loss" + ) + parser.add_argument( + "--no_cuda", action="store_true", help="Whether not to use CUDA when available" + ) + parser.add_argument( + "--on_memory", + action="store_true", + help="Whether to load train samples into memory or use disk", + ) + parser.add_argument( + "--do_lower_case", + action="store_true", + help="Whether to lower case the input text. True for uncased models, False for cased models.", + ) + parser.add_argument( + "--local_rank", + type=int, + default=-1, + help="local_rank for distributed training on gpus", + ) + parser.add_argument( + "--seed", type=int, default=42, help="random seed for initialization" + ) + parser.add_argument( + "--gradient_accumulation_steps", + type=int, + default=1, + help="Number of update steps to accumulate before performing a backward/update pass.", + ) + parser.add_argument( + "--fp16", + action="store_true", + help="Whether to use 16-bit float precision instead of 32-bit", + ) + parser.add_argument( + "--loss_scale", + type=float, + default=0, + help="Loss scaling to improve fp16 numeric stability. 
Only used when fp16 set to True.\n" + "0 (default value): dynamic loss scaling.\n" + "Positive power of 2: static loss scaling value.\n", + ) + parser.add_argument( + "--num_workers", + type=int, + default=20, + help="Number of workers in the dataloader.", + ) + parser.add_argument( + "--from_pretrained", + action="store_true", + help="Whether the model is initialized from pretrained weights.", + ) + parser.add_argument( + "--save_name", + default='', + type=str, + help="save name for training.", + ) + + args = parser.parse_args() + + print(args) + if args.save_name != '': + timeStamp = args.save_name + else: + timeStamp = strftime("%d-%b-%y-%X-%a", gmtime()) + timeStamp += "_{:0>6d}".format(random.randint(0, int(10e6))) + + savePath = os.path.join(args.output_dir, timeStamp) + + if not os.path.exists(savePath): + os.makedirs(savePath) + + # save all the hidden parameters. + with open(os.path.join(savePath, 'command.txt'), 'w') as f: + print(args, file=f) # Python 3.x + print('\n', file=f) + + if args.local_rank == -1 or args.no_cuda: + device = torch.device( + "cuda" if torch.cuda.is_available() and not args.no_cuda else "cpu" + ) + n_gpu = torch.cuda.device_count() + else: + torch.cuda.set_device(args.local_rank) + device = torch.device("cuda", args.local_rank) + n_gpu = 1 + # Initializes the distributed backend, which takes care of synchronizing nodes/GPUs + torch.distributed.init_process_group(backend="nccl") + logger.info( + "device: {} n_gpu: {}, distributed training: {}, 16-bits training: {}".format( + device, n_gpu, bool(args.local_rank != -1), args.fp16 + ) + ) + + if args.gradient_accumulation_steps < 1: + raise ValueError( + "Invalid gradient_accumulation_steps parameter: {}, should be >= 1".format( + args.gradient_accumulation_steps + ) + ) + + args.train_batch_size = args.train_batch_size // args.gradient_accumulation_steps + + random.seed(args.seed) + np.random.seed(args.seed) + torch.manual_seed(args.seed) + if n_gpu > 0: + torch.cuda.manual_seed_all(args.seed) + + if 
not args.do_train: + raise ValueError( + "Training is currently the only implemented execution option. Please set `do_train`." + ) + + if not os.path.exists(args.output_dir): + os.makedirs(args.output_dir) + + tokenizer = BertTokenizer.from_pretrained( + args.bert_model, do_lower_case=args.do_lower_case + ) + + num_train_optimization_steps = None + if args.do_train: + + viz = TBlogger("logs", timeStamp) + + train_dataset = ConceptCapLoaderTrain( + args.train_file, + tokenizer, + seq_len=args.max_seq_length, + batch_size=args.train_batch_size, + predict_feature=args.predict_feature, + num_workers=args.num_workers, + ) + + validation_dataset = ConceptCapLoaderVal( + args.validation_file, + tokenizer, + seq_len=args.max_seq_length, + batch_size=args.train_batch_size, + predict_feature=args.predict_feature, + num_workers=args.num_workers, + ) + + num_train_optimization_steps = ( + int( + train_dataset.num_dataset + / args.train_batch_size + / args.gradient_accumulation_steps + ) + * args.num_train_epochs + ) + if args.local_rank != -1: + num_train_optimization_steps = ( + num_train_optimization_steps // torch.distributed.get_world_size() + ) + + config = BertConfig.from_json_file(args.config_file) + + if args.from_pretrained: + model = BertForMultiModalPreTraining.from_pretrained(args.bert_model, config) + else: + model = BertForMultiModalPreTraining(config) + + if args.fp16: + model.half() + if args.local_rank != -1: + try: + from apex.parallel import DistributedDataParallel as DDP + except ImportError: + raise ImportError( + "Please install apex from https://www.github.com/nvidia/apex to use distributed and fp16 training." 
+ ) + model = DDP(model) + elif n_gpu > 1: + model = torch.nn.DataParallel(model) + # model = torch.nn.parallel.DistributedDataParallel(model) + model.cuda() + no_decay = ["bias", "LayerNorm.bias", "LayerNorm.weight"] + + if not args.from_pretrained: + param_optimizer = list(model.named_parameters()) + optimizer_grouped_parameters = [ + { + "params": [ + p for n, p in param_optimizer if not any(nd in n for nd in no_decay) + ], + "weight_decay": 0.01, + }, + { + "params": [ + p for n, p in param_optimizer if any(nd in n for nd in no_decay) + ], + "weight_decay": 0.0, + }, + ] + else: + bert_weight_name = json.load(open("config/" + args.pretrained_weight + "_weight_name.json", "r")) + optimizer_grouped_parameters = [] + for key, value in dict(model.named_parameters()).items(): + if value.requires_grad: + if key[12:] in bert_weight_name: + lr = args.learning_rate * 0.1 + else: + lr = args.learning_rate + + # parameters matched by no_decay are exempt from weight decay. + if any(nd in key for nd in no_decay): + optimizer_grouped_parameters += [ + {"params": [value], "lr": lr, "weight_decay": 0.0} + ] + + if not any(nd in key for nd in no_decay): + optimizer_grouped_parameters += [ + {"params": [value], "lr": lr, "weight_decay": 0.01} + ] + + # set different parameters for vision branch and language branch. + if args.fp16: + try: + from apex.optimizers import FP16_Optimizer + from apex.optimizers import FusedAdam + except ImportError: + raise ImportError( + "Please install apex from https://www.github.com/nvidia/apex to use distributed and fp16 training." 
+ ) + + optimizer = FusedAdam( + optimizer_grouped_parameters, + lr=args.learning_rate, + bias_correction=False, + max_grad_norm=1.0, + ) + if args.loss_scale == 0: + optimizer = FP16_Optimizer(optimizer, dynamic_loss_scale=True) + else: + optimizer = FP16_Optimizer(optimizer, static_loss_scale=args.loss_scale) + + else: + if args.from_pretrained: + optimizer = BertAdam( + optimizer_grouped_parameters, + warmup=args.warmup_proportion, + t_total=num_train_optimization_steps, + ) + + else: + optimizer = BertAdam( + optimizer_grouped_parameters, + lr=args.learning_rate, + warmup=args.warmup_proportion, + t_total=num_train_optimization_steps, + ) + + if args.do_train: + logger.info("***** Running training *****") + logger.info(" Num examples = %d", train_dataset.num_dataset) + logger.info(" Batch size = %d", args.train_batch_size) + logger.info(" Num steps = %d", num_train_optimization_steps) + + startIterID = 0 + global_step = 0 + masked_loss_v_tmp = 0 + masked_loss_t_tmp = 0 + next_sentence_loss_tmp = 0 + loss_tmp = 0 + start_t = timer() + + model.train() + # t1 = timer() + for epochId in trange(int(args.num_train_epochs), desc="Epoch"): + tr_loss = 0 + nb_tr_examples, nb_tr_steps = 0, 0 + + # iter_dataloader = iter(train_dataloader) + for step, batch in enumerate(train_dataset): + iterId = startIterID + step + (epochId * len(train_dataset)) + batch = tuple(t.cuda(device=device, non_blocking=True) for t in batch) + + input_ids, input_mask, segment_ids, lm_label_ids, is_next, image_feat, image_loc, \ + image_target, image_label, image_mask, image_ids = (batch) + + masked_loss_t, masked_loss_v, next_sentence_loss = model( + input_ids, + image_feat, + image_target, + image_loc, + segment_ids, + input_mask, + image_mask, + lm_label_ids, + image_label, + is_next, + ) + + masked_loss_v = masked_loss_v * args.img_weight + loss = masked_loss_t + masked_loss_v + next_sentence_loss + + if n_gpu > 1: + loss = loss.mean() # mean() to average on multi-gpu. 
+ masked_loss_t = masked_loss_t.mean() + masked_loss_v = masked_loss_v.mean() + next_sentence_loss = next_sentence_loss.mean() + if args.gradient_accumulation_steps > 1: + loss = loss / args.gradient_accumulation_steps + if args.fp16: + optimizer.backward(loss) + else: + loss.backward() + + if math.isnan(loss.item()): + pdb.set_trace() + + tr_loss += loss.item() + + # print(tr_loss) + viz.linePlot(iterId, loss.item(), "loss", "train") + viz.linePlot(iterId, masked_loss_t.item(), "masked_loss_t", "train") + viz.linePlot(iterId, masked_loss_v.item(), "masked_loss_v", "train") + viz.linePlot( + iterId, next_sentence_loss.item(), "next_sentence_loss", "train" + ) + # viz.linePlot(iterId, optimizer.get_lr()[0], 'learning_rate', 'train') + + loss_tmp += loss.item() + masked_loss_v_tmp += masked_loss_v.item() + masked_loss_t_tmp += masked_loss_t.item() + next_sentence_loss_tmp += next_sentence_loss.item() + + nb_tr_examples += input_ids.size(0) + nb_tr_steps += 1 + if (step + 1) % args.gradient_accumulation_steps == 0: + if args.fp16: + # modify learning rate with special warm up BERT uses + # if args.fp16 is False, BertAdam is used that handles this automatically + lr_this_step = args.learning_rate * warmup_linear( + global_step / num_train_optimization_steps, + args.warmup_proportion, + ) + for param_group in optimizer.param_groups: + param_group["lr"] = lr_this_step + + optimizer.step() + optimizer.zero_grad() + global_step += 1 + + if step % 20 == 0 and step != 0: + masked_loss_t_tmp = masked_loss_t_tmp / 20.0 + masked_loss_v_tmp = masked_loss_v_tmp / 20.0 + next_sentence_loss_tmp = next_sentence_loss_tmp / 20.0 + loss_tmp = loss_tmp / 20.0 + + end_t = timer() + timeStamp = strftime("%a %d %b %y %X", gmtime()) + + Ep = epochId + nb_tr_steps / float(len(train_dataset)) + printFormat = "[%s][Ep: %.2f][Iter: %d][Time: %5.2fs][Loss: %.5g][Loss_v: %.5g][Loss_t: %.5g][Loss_n: %.5g][LR: %.5g]" + + printInfo = [ + timeStamp, + Ep, + nb_tr_steps, + end_t - start_t, + loss_tmp, 
+ masked_loss_v_tmp, + masked_loss_t_tmp, + next_sentence_loss_tmp, + optimizer.get_lr()[0], + ] + + start_t = end_t + print(printFormat % tuple(printInfo)) + + masked_loss_v_tmp = 0 + masked_loss_t_tmp = 0 + next_sentence_loss_tmp = 0 + loss_tmp = 0 + + # Do the evaluation + torch.set_grad_enabled(False) + start_t = timer() + numBatches = len(validation_dataset) + eval_masked_loss_t = 0 + eval_masked_loss_v = 0 + eval_next_sentence_loss = 0 + eval_total_loss = 0 + + model.eval() + for step, batch in enumerate(validation_dataset): + batch = tuple(t.cuda(device=device, non_blocking=True) for t in batch) + + input_ids, input_mask, segment_ids, lm_label_ids, is_next, image_feat, image_loc, image_target, image_label, image_mask, image_ids = ( + batch + ) + + masked_loss_t, masked_loss_v, next_sentence_loss = model( + input_ids, + image_feat, + image_target, + image_loc, + segment_ids, + input_mask, + image_mask, + lm_label_ids, + image_label, + is_next, + ) + + masked_loss_v = masked_loss_v * args.img_weight + loss = masked_loss_t + masked_loss_v + next_sentence_loss + + if n_gpu > 1: + loss = loss.mean() # mean() to average on multi-gpu. 
+ masked_loss_t = masked_loss_t.mean() + masked_loss_v = masked_loss_v.mean() + next_sentence_loss = next_sentence_loss.mean() + + + eval_masked_loss_t += masked_loss_t.item() + eval_masked_loss_v += masked_loss_v.item() + eval_next_sentence_loss += next_sentence_loss.item() + eval_total_loss += loss.item() + + end_t = timer() + delta_t = " Time: %5.2fs" % (end_t - start_t) + start_t = end_t + progressString = "\r Evaluating split '%s' [%d/%d]\t" + delta_t + sys.stdout.write(progressString % ('val', step + 1, numBatches)) + sys.stdout.flush() + + eval_masked_loss_t = eval_masked_loss_t / float(numBatches) + eval_masked_loss_v = eval_masked_loss_v / float(numBatches) + eval_next_sentence_loss = eval_next_sentence_loss / float(numBatches) + eval_total_loss = eval_total_loss / float(numBatches) + + printFormat = "Evaluation: [Loss: %.5g][Loss_v: %.5g][Loss_t: %.5g][Loss_n: %.5g]" + printInfo = [ + eval_total_loss, + eval_masked_loss_v, + eval_masked_loss_t, + eval_next_sentence_loss] + + print(printFormat % tuple(printInfo)) + + torch.set_grad_enabled(True) + + viz.linePlot(epochId, eval_total_loss, "loss", "val") + viz.linePlot(epochId, eval_masked_loss_t, "masked_loss_t", "val") + viz.linePlot(epochId, eval_masked_loss_v, "masked_loss_v", "val") + viz.linePlot(epochId, eval_next_sentence_loss, "next_sentence_loss", "val") + + # Save a trained model + logger.info("** ** * Saving fine-tuned model ** ** * ") + model_to_save = ( + model.module if hasattr(model, "module") else model + ) # Only save the model itself + + output_model_file = os.path.join( + savePath, "pytorch_model_" + str(epochId) + ".bin" + ) + if args.do_train: + torch.save(model_to_save.state_dict(), output_model_file) + +class TBlogger: + def __init__(self, log_dir, exp_name): + log_dir = log_dir + "/" + exp_name + print("logging file at: " + log_dir) + self.logger = SummaryWriter(log_dir=log_dir) + + def linePlot(self, step, val, split, key, xlabel="None"): + self.logger.add_scalar(split + "/" + 
key, val, step) + + +if __name__ == "__main__": + + main() diff --git a/train_concap.py b/train_concap.py new file mode 100644 index 0000000..5a93e41 --- /dev/null +++ b/train_concap.py @@ -0,0 +1,672 @@ +import argparse +import json +import logging +import os +import random +from io import open +import math +import sys + +from time import gmtime, strftime +from timeit import default_timer as timer + +import numpy as np +from tqdm import tqdm, trange + +import torch +from torch.utils.data import DataLoader, Dataset, RandomSampler +from torch.utils.data.distributed import DistributedSampler +# from parallel.data_parallel import DataParallel +from tensorboardX import SummaryWriter + +from pytorch_pretrained_bert.tokenization import BertTokenizer +from pytorch_pretrained_bert.optimization import BertAdam, WarmupLinearSchedule + +from vilbert.datasets import ConceptCapLoaderTrain, ConceptCapLoaderVal +from vilbert.vilbert import BertForMultiModalPreTraining, BertConfig +import torch.distributed as dist + +import pdb + +logging.basicConfig( + format="%(asctime)s - %(levelname)s - %(name)s - %(message)s", + datefmt="%m/%d/%Y %H:%M:%S", + level=logging.INFO, +) +logger = logging.getLogger(__name__) + + +def main(): + parser = argparse.ArgumentParser() + + # Required parameters + parser.add_argument( + "--train_file", + default="data/conceptual_caption/training", + type=str, + # required=True, + help="The input train corpus.", + ) + parser.add_argument( + "--validation_file", + default="data/conceptual_caption/validation", + type=str, + # required=True, + help="The input validation corpus.", + ) + parser.add_argument( + "--from_pretrained", + default="", + type=str, + help="Bert pre-trained model selected in the list: bert-base-uncased, " + "bert-large-uncased, bert-base-cased, bert-base-multilingual, bert-base-chinese.", + ) + parser.add_argument( + "--bert_model", + default="bert-base-uncased", + type=str, + help="Bert pre-trained model selected in the list: 
bert-base-uncased, " + "bert-large-uncased, bert-base-cased, bert-base-multilingual, bert-base-chinese.", + ) + parser.add_argument( + "--output_dir", + default="save", + type=str, + # required=True, + help="The output directory where the model checkpoints will be written.", + ) + + parser.add_argument( + "--config_file", + default="config/bert_config.json", + type=str, + # required=True, + help="The config file that specifies the model details.", + ) + ## Other parameters + parser.add_argument( + "--max_seq_length", + default=36, + type=int, + help="The maximum total input sequence length after WordPiece tokenization. \n" + "Sequences longer than this will be truncated, and sequences shorter \n" + "than this will be padded.", + ) + parser.add_argument("--predict_feature", action="store_true", help="visual target.") + + parser.add_argument( + "--train_batch_size", + default=512, + type=int, + help="Total batch size for training.", + ) + parser.add_argument( + "--learning_rate", + default=1e-4, + type=float, + help="The initial learning rate for Adam.", + ) + parser.add_argument( + "--num_train_epochs", + default=10.0, + type=float, + help="Total number of training epochs to perform.", + ) + parser.add_argument( + "--start_epoch", + default=0, + type=float, + help="The epoch to start (resume) training from.", + ) + parser.add_argument( + "--warmup_proportion", + default=0.1, + type=float, + help="Proportion of training to perform linear learning rate warmup for. " + "E.g., 0.1 = 10%% of training.", + ) + parser.add_argument( + "--img_weight", default=1, type=float, help="weight for the image loss" + ) + parser.add_argument( + "--no_cuda", action="store_true", help="Whether not to use CUDA when available" + ) + parser.add_argument( + "--on_memory", + action="store_true", + help="Whether to load train samples into memory or use disk", + ) + parser.add_argument( + "--do_lower_case", + type=bool, + default=True, + help="Whether to lower case the input text. 
True for uncased models, False for cased models.", + ) + parser.add_argument( + "--local_rank", + type=int, + default=-1, + help="local_rank for distributed training on gpus", + ) + parser.add_argument( + "--seed", type=int, default=42, help="random seed for initialization" + ) + parser.add_argument( + "--gradient_accumulation_steps", + type=int, + default=1, + help="Number of update steps to accumulate before performing a backward/update pass.", + ) + parser.add_argument( + "--fp16", + action="store_true", + help="Whether to use 16-bit float precision instead of 32-bit", + ) + parser.add_argument( + "--loss_scale", + type=float, + default=0, + help="Loss scaling to improve fp16 numeric stability. Only used when fp16 set to True.\n" + "0 (default value): dynamic loss scaling.\n" + "Positive power of 2: static loss scaling value.\n", + ) + parser.add_argument( + "--num_workers", + type=int, + default=3, + help="Number of workers in the dataloader.", + ) + + parser.add_argument( + "--save_name", + default='', + type=str, + help="save name for training.", + ) + parser.add_argument( + "--baseline", action="store_true", help="Whether to use the baseline model (a single BERT)." + ) + parser.add_argument( + "--freeze", default=-1, type=int, + help="index of the last layer of the textual stream of ViLBERT to keep frozen." + ) + parser.add_argument( + "--use_chuncks", default=0, type=float, help="whether to use chunks for parallel training." + ) + parser.add_argument( + "--distributed", action="store_true", help="whether to use distributed training." + ) + parser.add_argument( + "--without_coattention", action="store_true", help="whether to disable the co-attention layers." 
+ ) + args = parser.parse_args() + if args.baseline: + from pytorch_pretrained_bert.modeling import BertConfig + from vilbert.basebert import BertForMultiModalPreTraining + else: + from vilbert.vilbert import BertForMultiModalPreTraining, BertConfig + + print(args) + if args.save_name != '': + timeStamp = args.save_name + else: + timeStamp = strftime("%d-%b-%y-%X-%a", gmtime()) + timeStamp += "_{:0>6d}".format(random.randint(0, int(10e6))) + + savePath = os.path.join(args.output_dir, timeStamp) + + if not os.path.exists(savePath): + os.makedirs(savePath) + + config = BertConfig.from_json_file(args.config_file) + + if args.freeze > config.t_biattention_id[0]: + config.fixed_t_layer = config.t_biattention_id[0] + + if args.without_coattention: + config.with_coattention = False + # save all the hidden parameters. + with open(os.path.join(savePath, 'command.txt'), 'w') as f: + print(args, file=f) # Python 3.x + print('\n', file=f) + print(config, file=f) + + bert_weight_name = json.load(open("config/" + args.from_pretrained + "_weight_name.json", "r")) + if args.local_rank == -1 or args.no_cuda: + device = torch.device( + "cuda" if torch.cuda.is_available() and not args.no_cuda else "cpu" + ) + n_gpu = torch.cuda.device_count() + else: + torch.cuda.set_device(args.local_rank) + device = torch.device("cuda", args.local_rank) + n_gpu = 1 + # Initializes the distributed backend, which takes care of synchronizing nodes/GPUs + torch.distributed.init_process_group(backend="nccl") + logger.info( + "device: {} n_gpu: {}, distributed training: {}, 16-bits training: {}".format( + device, n_gpu, bool(args.local_rank != -1), args.fp16 + ) + ) + + if args.gradient_accumulation_steps < 1: + raise ValueError( + "Invalid gradient_accumulation_steps parameter: {}, should be >= 1".format( + args.gradient_accumulation_steps + ) + ) + + args.train_batch_size = args.train_batch_size // args.gradient_accumulation_steps + + random.seed(args.seed) + np.random.seed(args.seed) + 
torch.manual_seed(args.seed) + if n_gpu > 0: + torch.cuda.manual_seed_all(args.seed) + + if not os.path.exists(args.output_dir): + os.makedirs(args.output_dir) + + tokenizer = BertTokenizer.from_pretrained( + args.bert_model, do_lower_case=args.do_lower_case + ) + + num_train_optimization_steps = None + + viz = TBlogger("logs", timeStamp) + + train_dataset = ConceptCapLoaderTrain( + args.train_file, + tokenizer, + seq_len=args.max_seq_length, + batch_size=args.train_batch_size, + predict_feature=args.predict_feature, + num_workers=args.num_workers, + distributed=args.distributed, + ) + + validation_dataset = ConceptCapLoaderVal( + args.validation_file, + tokenizer, + seq_len=args.max_seq_length, + batch_size=args.train_batch_size, + predict_feature=args.predict_feature, + num_workers=2, + distributed=args.distributed, + ) + + num_train_optimization_steps = ( + int( + train_dataset.num_dataset + / args.train_batch_size + / args.gradient_accumulation_steps + ) + * (args.num_train_epochs - args.start_epoch) + ) + # if args.local_rank != -1: + # num_train_optimization_steps = ( + # num_train_optimization_steps // torch.distributed.get_world_size() + # ) + + default_gpu = False + if dist.is_available() and args.distributed: + rank = dist.get_rank() + if rank == 0: + default_gpu = True + else: + default_gpu = True + + # pdb.set_trace() + if args.predict_feature: + config.v_target_size = 2048 + config.predict_feature = True + else: + config.v_target_size = 1601 + config.predict_feature = False + + if args.from_pretrained: + model = BertForMultiModalPreTraining.from_pretrained(args.from_pretrained, config) + else: + model = BertForMultiModalPreTraining(config) + + model.cuda() + + if args.fp16: + model.half() + if args.local_rank != -1: + try: + from apex.parallel import DistributedDataParallel as DDP + except ImportError: + raise ImportError( + "Please install apex from https://www.github.com/nvidia/apex to use distributed and fp16 training." 
+ ) + model = DDP(model) + elif n_gpu > 1: + model = torch.nn.DataParallel(model) + + no_decay = ["bias", "LayerNorm.bias", "LayerNorm.weight"] + + if args.freeze != -1: + bert_weight_name_filtered = [] + for name in bert_weight_name: + if 'embeddings' in name: + bert_weight_name_filtered.append(name) + elif 'encoder' in name: + layer_num = name.split('.')[2] + if int(layer_num) <= args.freeze: + bert_weight_name_filtered.append(name) + + optimizer_grouped_parameters = [] + for key, value in dict(model.named_parameters()).items(): + if key[12:] in bert_weight_name_filtered: + value.requires_grad = False + + if default_gpu: + print("filtered weight") + print(bert_weight_name_filtered) + + if not args.from_pretrained: + param_optimizer = list(model.named_parameters()) + optimizer_grouped_parameters = [ + { + "params": [ + p for n, p in param_optimizer if not any(nd in n for nd in no_decay) + ], + "weight_decay": 0.01, + }, + { + "params": [ + p for n, p in param_optimizer if any(nd in n for nd in no_decay) + ], + "weight_decay": 0.0, + }, + ] + else: + optimizer_grouped_parameters = [] + for key, value in dict(model.named_parameters()).items(): + if value.requires_grad: + if key[12:] in bert_weight_name: + lr = args.learning_rate * 0.1 + else: + lr = args.learning_rate + + # parameters matched by no_decay are exempt from weight decay. + if any(nd in key for nd in no_decay): + optimizer_grouped_parameters += [ + {"params": [value], "lr": lr, "weight_decay": 0.0} + ] + + if not any(nd in key for nd in no_decay): + optimizer_grouped_parameters += [ + {"params": [value], "lr": lr, "weight_decay": 0.01} + ] + if default_gpu: + print(len(list(model.named_parameters())), len(optimizer_grouped_parameters)) + + # set different parameters for vision branch and language branch. + if args.fp16: + try: + from apex.optimizers import FP16_Optimizer + from apex.optimizers import FusedAdam + except ImportError: + raise ImportError( + "Please install apex from https://www.github.com/nvidia/apex to use distributed and fp16 training." 
+ ) + + optimizer = FusedAdam( + optimizer_grouped_parameters, + lr=args.learning_rate, + bias_correction=False, + max_grad_norm=1.0, + ) + if args.loss_scale == 0: + optimizer = FP16_Optimizer(optimizer, dynamic_loss_scale=True) + else: + optimizer = FP16_Optimizer(optimizer, static_loss_scale=args.loss_scale) + + else: + if args.from_pretrained: + optimizer = BertAdam( + optimizer_grouped_parameters, + warmup=args.warmup_proportion, + t_total=num_train_optimization_steps, + + ) + + else: + optimizer = BertAdam( + optimizer_grouped_parameters, + lr=args.learning_rate, + warmup=args.warmup_proportion, + t_total=num_train_optimization_steps, + ) + + logger.info("***** Running training *****") + logger.info(" Num examples = %d", train_dataset.num_dataset) + logger.info(" Batch size = %d", args.train_batch_size) + logger.info(" Num steps = %d", num_train_optimization_steps) + + startIterID = 0 + global_step = 0 + masked_loss_v_tmp = 0 + masked_loss_t_tmp = 0 + next_sentence_loss_tmp = 0 + loss_tmp = 0 + start_t = timer() + + # t1 = timer() + for epochId in range(int(args.start_epoch), int(args.num_train_epochs)): + model.train() + tr_loss = 0 + nb_tr_examples, nb_tr_steps = 0, 0 + + # iter_dataloader = iter(train_dataloader) + for step, batch in enumerate(train_dataset): + iterId = startIterID + step + (epochId * len(train_dataset)) + # batch = iter_dataloader.next() + batch = tuple(t.cuda(device=device, non_blocking=True) for t in batch) + + input_ids, input_mask, segment_ids, lm_label_ids, is_next, image_feat, image_loc, image_target, image_label, image_mask, image_ids = ( + batch + ) + + masked_loss_t, masked_loss_v, next_sentence_loss = model( + input_ids, + image_feat, + image_loc, + segment_ids, + input_mask, + image_mask, + lm_label_ids, + image_label, + image_target, + is_next, + ) + + if args.without_coattention: + next_sentence_loss = next_sentence_loss * 0 + + masked_loss_v = masked_loss_v * args.img_weight + loss = masked_loss_t + masked_loss_v + 
next_sentence_loss + + if n_gpu > 1: + loss = loss.mean() # mean() to average on multi-gpu. + masked_loss_t = masked_loss_t.mean() + masked_loss_v = masked_loss_v.mean() + next_sentence_loss = next_sentence_loss.mean() + if args.gradient_accumulation_steps > 1: + loss = loss / args.gradient_accumulation_steps + if args.fp16: + optimizer.backward(loss) + else: + loss.backward() + + if math.isnan(loss.item()): + pdb.set_trace() + + tr_loss += loss.item() + + rank = 0 + + if dist.is_available() and args.distributed: + rank = dist.get_rank() + else: + rank = 0 + + viz.linePlot(iterId, loss.item(), "loss_"+str(rank), "train") + viz.linePlot(iterId, masked_loss_t.item(), "masked_loss_t_"+str(rank), "train") + viz.linePlot(iterId, masked_loss_v.item(), "masked_loss_v_"+str(rank), "train") + viz.linePlot( + iterId, next_sentence_loss.item(), "next_sentence_loss_"+str(rank), "train" + ) + # viz.linePlot(iterId, optimizer.get_lr()[0], 'learning_rate', 'train') + + loss_tmp += loss.item() + masked_loss_v_tmp += masked_loss_v.item() + masked_loss_t_tmp += masked_loss_t.item() + next_sentence_loss_tmp += next_sentence_loss.item() + + nb_tr_examples += input_ids.size(0) + nb_tr_steps += 1 + if (step + 1) % args.gradient_accumulation_steps == 0: + if args.fp16: + # modify learning rate with special warm up BERT uses + # if args.fp16 is False, BertAdam is used that handles this automatically + lr_this_step = args.learning_rate * warmup_linear( + global_step / num_train_optimization_steps, + args.warmup_proportion, + ) + for param_group in optimizer.param_groups: + param_group["lr"] = lr_this_step + + optimizer.step() + optimizer.zero_grad() + global_step += 1 + + if step % 20 == 0 and step != 0: + masked_loss_t_tmp = masked_loss_t_tmp / 20.0 + masked_loss_v_tmp = masked_loss_v_tmp / 20.0 + next_sentence_loss_tmp = next_sentence_loss_tmp / 20.0 + loss_tmp = loss_tmp / 20.0 + + end_t = timer() + timeStamp = strftime("%a %d %b %y %X", gmtime()) + + Ep = epochId + nb_tr_steps / 
float(len(train_dataset)) + printFormat = "[%s][Ep: %.2f][Iter: %d][Time: %5.2fs][Loss: %.5g][Loss_v: %.5g][Loss_t: %.5g][Loss_n: %.5g][LR: %.8g]" + + printInfo = [ + timeStamp, + Ep, + nb_tr_steps, + end_t - start_t, + loss_tmp, + masked_loss_v_tmp, + masked_loss_t_tmp, + next_sentence_loss_tmp, + optimizer.get_lr()[0], + ] + + start_t = end_t + print(printFormat % tuple(printInfo)) + + masked_loss_v_tmp = 0 + masked_loss_t_tmp = 0 + next_sentence_loss_tmp = 0 + loss_tmp = 0 + + # Do the evaluation + torch.set_grad_enabled(False) + start_t = timer() + numBatches = len(validation_dataset) + eval_masked_loss_t = 0 + eval_masked_loss_v = 0 + eval_next_sentence_loss = 0 + eval_total_loss = 0 + + model.eval() + for step, batch in enumerate(validation_dataset): + batch = tuple(t.cuda(device=device, non_blocking=True) for t in batch) + + input_ids, input_mask, segment_ids, lm_label_ids, is_next, image_feat, image_loc, image_target, image_label, image_mask, image_ids = ( + batch + ) + + masked_loss_t, masked_loss_v, next_sentence_loss = model( + input_ids, + image_feat, + image_loc, + segment_ids, + input_mask, + image_mask, + lm_label_ids, + image_label, + image_target, + is_next, + ) + + masked_loss_v = masked_loss_v * args.img_weight + loss = masked_loss_t + masked_loss_v + next_sentence_loss + + if n_gpu > 1: + loss = loss.mean() # mean() to average on multi-gpu. 
+ masked_loss_t = masked_loss_t.mean() + masked_loss_v = masked_loss_v.mean() + next_sentence_loss = next_sentence_loss.mean() + + + eval_masked_loss_t += masked_loss_t.item() + eval_masked_loss_v += masked_loss_v.item() + eval_next_sentence_loss += next_sentence_loss.item() + eval_total_loss += loss.item() + + end_t = timer() + delta_t = " Time: %5.2fs" % (end_t - start_t) + start_t = end_t + progressString = "\r Evaluating split '%s' [%d/%d]\t" + delta_t + sys.stdout.write(progressString % ('val', step + 1, numBatches)) + sys.stdout.flush() + + eval_masked_loss_t = eval_masked_loss_t / float(numBatches) + eval_masked_loss_v = eval_masked_loss_v / float(numBatches) + eval_next_sentence_loss = eval_next_sentence_loss / float(numBatches) + eval_total_loss = eval_total_loss / float(numBatches) + + printFormat = "Evaluation: [Loss: %.5g][Loss_v: %.5g][Loss_t: %.5g][Loss_n: %.5g]" + printInfo = [ + eval_total_loss, + eval_masked_loss_v, + eval_masked_loss_t, + eval_next_sentence_loss] + + print(printFormat % tuple(printInfo)) + torch.set_grad_enabled(True) + + viz.linePlot(epochId, eval_total_loss, "loss_" + str(rank), "val") + viz.linePlot(epochId, eval_masked_loss_t, "masked_loss_t_" + str(rank), "val") + viz.linePlot(epochId, eval_masked_loss_v, "masked_loss_v_" + str(rank), "val") + viz.linePlot(epochId, eval_next_sentence_loss, "next_sentence_loss_" + str(rank), "val") + + if default_gpu: + # Save a trained model + logger.info("** ** * Saving fine - tuned model ** ** * ") + model_to_save = ( + model.module if hasattr(model, "module") else model + ) # Only save the model it-self + output_model_file = os.path.join( + savePath, "pytorch_model_" + str(epochId) + ".bin" + ) + + torch.save(model_to_save.state_dict(), output_model_file) + +class TBlogger: + def __init__(self, log_dir, exp_name): + log_dir = log_dir + "/" + exp_name + print("logging file at: " + log_dir) + self.logger = SummaryWriter(log_dir=log_dir) + + def linePlot(self, step, val, split, key, 
xlabel="None"): + self.logger.add_scalar(split + "/" + key, val, step) + +if __name__ == "__main__": + + main() diff --git a/train_tasks.py b/train_tasks.py new file mode 100644 index 0000000..e6ead24 --- /dev/null +++ b/train_tasks.py @@ -0,0 +1,432 @@ +import argparse +import json +import logging +import os +import random +from io import open +import numpy as np + +from tensorboardX import SummaryWriter +from tqdm import tqdm +from bisect import bisect +import yaml +from easydict import EasyDict as edict + +import pdb +import sys +import torch +import torch.nn.functional as F +import torch.nn as nn + +from pytorch_pretrained_bert.optimization import WarmupLinearSchedule + +# from parallel.parallel import DataParallelModel, DataParallelCriterion + +from vilbert.task_utils import LoadDatasets, LoadLosses, ForwardModelsTrain, ForwardModelsVal +from vilbert.optimization import BertAdam, Adam, Adamax +from torch.optim.lr_scheduler import LambdaLR, ReduceLROnPlateau + +import vilbert.utils as utils +import torch.distributed as dist + +logging.basicConfig( + format="%(asctime)s - %(levelname)s - %(name)s - %(message)s", + datefmt="%m/%d/%Y %H:%M:%S", + level=logging.INFO, +) +logger = logging.getLogger(__name__) + +def main(): + parser = argparse.ArgumentParser() + + parser.add_argument( + "--bert_model", + default="bert-base-uncased", + type=str, + help="Bert pre-trained model selected in the list: bert-base-uncased, " + "bert-large-uncased, bert-base-cased, bert-base-multilingual, bert-base-chinese.", + ) + parser.add_argument( + "--from_pretrained", + default="bert-base-uncased", + type=str, + help="Bert pre-trained model selected in the list: bert-base-uncased, " + "bert-large-uncased, bert-base-cased, bert-base-multilingual, bert-base-chinese.", + ) + parser.add_argument( + "--output_dir", + default="save", + type=str, + help="The output directory where the model checkpoints will be written.", + ) + parser.add_argument( + "--config_file", + 
default="config/bert_config.json", + type=str, + help="The config file which specified the model details.", + ) + parser.add_argument( + "--learning_rate", default=2e-5, type=float, help="The initial learning rate for Adam." + ) + parser.add_argument( + "--num_train_epochs", + default=20, + type=int, + help="Total number of training epochs to perform.", + ) + parser.add_argument( + "--warmup_proportion", + default=0.1, + type=float, + help="Proportion of training to perform linear learning rate warmup for. " + "E.g., 0.1 = 10%% of training.", + ) + parser.add_argument( + "--no_cuda", action="store_true", help="Whether not to use CUDA when available" + ) + parser.add_argument( + "--do_lower_case", + default=True, + type=bool, + help="Whether to lower case the input text. True for uncased models, False for cased models.", + ) + parser.add_argument( + "--local_rank", type=int, default=-1, help="local_rank for distributed training on gpus" + ) + parser.add_argument("--seed", type=int, default=0, help="random seed for initialization") + parser.add_argument( + "--gradient_accumulation_steps", + type=int, + default=1, + help="Number of updates steps to accumualte before performing a backward/update pass.", + ) + parser.add_argument( + "--fp16", + action="store_true", + help="Whether to use 16-bit float precision instead of 32-bit", + ) + parser.add_argument( + "--loss_scale", + type=float, + default=0, + help="Loss scaling to improve fp16 numeric stability. Only used when fp16 set to True.\n" + "0 (default value): dynamic loss scaling.\n" + "Positive power of 2: static loss scaling value.\n", + ) + parser.add_argument( + "--num_workers", type=int, default=16, help="Number of workers in the dataloader." + ) + parser.add_argument( + "--save_name", + default='', + type=str, + help="save name for training.", + ) + parser.add_argument( + "--use_chunk", default=0, type=float, help="whether use chunck for parallel training." 
+    )
+    parser.add_argument(
+        "--in_memory", default=False, type=bool, help="whether to load the dataset into memory."
+    )
+    parser.add_argument(
+        "--optimizer", default='BertAdam', type=str, help="which optimizer to use: BertAdam, Adam or Adamax."
+    )
+    parser.add_argument(
+        "--tasks", default='', type=str, help="training task ids separated by '-', e.g. 1-2-3."
+    )
+    parser.add_argument(
+        "--freeze", default=-1, type=int,
+        help="freeze the textual stream of ViLBERT up to (and including) this layer."
+    )
+    parser.add_argument(
+        "--vision_scratch", action="store_true", help="whether to train the vision stream from scratch."
+    )
+    parser.add_argument(
+        "--evaluation_interval", default=1, type=int, help="evaluate every n epochs."
+    )
+    parser.add_argument(
+        "--lr_scheduler", default='mannul', type=str, help="which learning rate scheduler to use ('automatic' or 'mannul')."
+    )
+    parser.add_argument(
+        "--baseline", action="store_true", help="whether to use the single-stream baseline."
+    )
+    parser.add_argument(
+        "--compact", action="store_true", help="whether to use the compact ViLBERT model."
+ ) + args = parser.parse_args() + with open('vlbert_tasks.yml', 'r') as f: + task_cfg = edict(yaml.load(f)) + + # random.seed(args.seed) + # np.random.seed(args.seed) + # torch.manual_seed(args.seed) + + if args.baseline: + from pytorch_pretrained_bert.modeling import BertConfig + from vilbert.basebert import BaseBertForVLTasks + elif args.compact: + from vilbert.vilbert_compact import BertConfig + from vilbert.vilbert_compact import VILBertForVLTasks + else: + from vilbert.vilbert import BertConfig + from vilbert.vilbert import VILBertForVLTasks + + task_names = [] + task_lr = [] + for i, task_id in enumerate(args.tasks.split('-')): + task = 'TASK' + task_id + name = task_cfg[task]['name'] + task_names.append(name) + task_lr.append(task_cfg[task]['lr']) + + base_lr = min(task_lr) + loss_scale = {} + for i, task_id in enumerate(args.tasks.split('-')): + task = 'TASK' + task_id + loss_scale[task] = task_lr[i] / base_lr + + if args.save_name: + prefix = '-' + args.save_name + else: + prefix = '' + timeStamp = '-'.join(task_names) + '_' + args.config_file.split('/')[1].split('.')[0] + prefix + savePath = os.path.join(args.output_dir, timeStamp) + + bert_weight_name = json.load(open("config/" + args.bert_model + "_weight_name.json", "r")) + + if args.local_rank == -1 or args.no_cuda: + device = torch.device("cuda" if torch.cuda.is_available() and not args.no_cuda else "cpu") + n_gpu = torch.cuda.device_count() + else: + torch.cuda.set_device(args.local_rank) + device = torch.device("cuda", args.local_rank) + n_gpu = 1 + # Initializes the distributed backend which will take care of sychronizing nodes/GPUs + torch.distributed.init_process_group(backend="nccl") + + logger.info( + "device: {} n_gpu: {}, distributed training: {}, 16-bits training: {}".format( + device, n_gpu, bool(args.local_rank != -1), args.fp16 + ) + ) + + default_gpu = False + if dist.is_available() and args.local_rank != -1: + rank = dist.get_rank() + if rank == 0: + default_gpu = True + else: + 
+        default_gpu = True
+
+    if default_gpu:
+        if not os.path.exists(savePath):
+            os.makedirs(savePath)
+
+    config = BertConfig.from_json_file(args.config_file)
+    if default_gpu:
+        # save all the hidden parameters.
+        with open(os.path.join(savePath, 'command.txt'), 'w') as f:
+            print(args, file=f)  # Python 3.x
+            print('\n', file=f)
+            print(config, file=f)
+
+    task_batch_size, task_num_iters, task_ids, task_datasets_train, task_datasets_val, \
+        task_dataloader_train, task_dataloader_val = LoadDatasets(args, task_cfg, args.tasks.split('-'))
+
+    tbLogger = utils.tbLogger(timeStamp, savePath, task_names, task_ids, task_num_iters, args.gradient_accumulation_steps)
+
+    # if n_gpu > 0:
+    #     torch.cuda.manual_seed_all(args.seed)
+
+    if not os.path.exists(args.output_dir):
+        os.makedirs(args.output_dir)
+
+    num_train_optimization_steps = max(task_num_iters.values()) * args.num_train_epochs // args.gradient_accumulation_steps
+    num_labels = max([dataset.num_labels for dataset in task_datasets_train.values()])
+
+    # per-task start iteration and interval, derived from each task's configured num_epoch.
+    task_start_iter = {}
+    task_interval = {}
+    for task_id, num_iter in task_num_iters.items():
+        task_start_iter[task_id] = num_train_optimization_steps - (task_cfg[task_id]['num_epoch'] * num_iter // args.gradient_accumulation_steps)
+        task_interval[task_id] = num_train_optimization_steps // (task_cfg[task_id]['num_epoch'] * num_iter // args.gradient_accumulation_steps)
+
+    if args.baseline:
+        model = BaseBertForVLTasks.from_pretrained(
+            args.from_pretrained, config, num_labels=num_labels, default_gpu=default_gpu
+        )
+    else:
+        model = VILBertForVLTasks.from_pretrained(
+            args.from_pretrained, config, num_labels=num_labels, default_gpu=default_gpu
+        )
+
+    task_losses = LoadLosses(args, task_cfg, args.tasks.split('-'))
+    model.to(device)
+    if args.local_rank != -1:
+        try:
+            from apex.parallel import DistributedDataParallel as DDP
+        except ImportError:
+            raise ImportError(
+                "Please install apex from https://www.github.com/nvidia/apex to use distributed and fp16 training."
+ ) + model = DDP(model, delay_allreduce=True) + + elif n_gpu > 1: + model = torch.nn.DataParallel(model) + + no_decay = ["bias", "LayerNorm.bias", "LayerNorm.weight"] + + if args.freeze != -1: + bert_weight_name_filtered = [] + for name in bert_weight_name: + if 'embeddings' in name: + bert_weight_name_filtered.append(name) + elif 'encoder' in name: + layer_num = name.split('.')[2] + if int(layer_num) <= args.freeze: + bert_weight_name_filtered.append(name) + + optimizer_grouped_parameters = [] + for key, value in dict(model.named_parameters()).items(): + if key[12:] in bert_weight_name_filtered: + value.requires_grad = False + + if default_gpu: + print("filtered weight") + print(bert_weight_name_filtered) + + optimizer_grouped_parameters = [] + lr = args.learning_rate + for key, value in dict(model.named_parameters()).items(): + if value.requires_grad: + if 'vil_prediction' in key: + # if args.learning_rate <= 2e-5: + lr = 1e-4 + else: + if args.vision_scratch: + if key[12:] in bert_weight_name: + lr = args.learning_rate + else: + lr = 1e-4 + else: + lr = args.learning_rate + if any(nd in key for nd in no_decay): + optimizer_grouped_parameters += [ + {"params": [value], "lr": lr, "weight_decay": 0.01} + ] + if not any(nd in key for nd in no_decay): + optimizer_grouped_parameters += [ + {"params": [value], "lr": lr, "weight_decay": 0.0} + ] + + if default_gpu: + print(len(list(model.named_parameters())), len(optimizer_grouped_parameters)) + + max_num_iter = max(task_num_iters.values()) + max_batch_size = max(task_batch_size.values()) + + if args.optimizer == 'BertAdam': + optimizer = BertAdam( + optimizer_grouped_parameters, + lr=args.learning_rate, + warmup=args.warmup_proportion, + t_total=num_train_optimization_steps, + schedule='warmup_constant', + ) + elif args.optimizer == 'Adam': + optimizer = Adam( + optimizer_grouped_parameters, + lr=base_lr, + warmup=args.warmup_proportion, + t_total=num_train_optimization_steps, + schedule='warmup_constant', + ) + elif 
args.optimizer == 'Adamax': + optimizer = Adamax( + optimizer_grouped_parameters, + lr=base_lr, + warmup=args.warmup_proportion, + t_total=num_train_optimization_steps, + schedule='warmup_constant', + ) + + if args.lr_scheduler == 'automatic': + lr_scheduler = ReduceLROnPlateau(optimizer, \ + mode='max', + factor=0.2, + patience=1, + cooldown=1, + threshold=0.001) + elif args.lr_scheduler == 'mannul': + lr_reduce_list = np.array([12, 16]) + # lr_reduce_list = np.array([6, 8, 10]) + def lr_lambda_fun(epoch): + return pow(0.1, np.sum(lr_reduce_list <= epoch)) + lr_scheduler = LambdaLR(optimizer, lr_lambda=lr_lambda_fun) + + if default_gpu: + print("***** Running training *****") + print(" Num Iters: ", task_num_iters) + print(" Batch size: ", task_batch_size) + print(" Num steps: %d" %num_train_optimization_steps) + + startIterID = 0 + # initialize the data iteration. + task_iter_train = {name:None for name in task_ids} + task_count = {name:0 for name in task_ids} + for epochId in tqdm(range(args.num_train_epochs), desc="Epoch"): + model.train() + for step in range(max_num_iter): + iterId = startIterID + step + (epochId * max_num_iter) + for task_id in task_ids: + if iterId >= task_start_iter[task_id]: + # if iterId % task_interval[task_id] == 0: + loss, score = ForwardModelsTrain(args, task_cfg, device, task_id, task_count, task_iter_train, task_dataloader_train, model, task_losses, task_start_iter) + loss = loss * loss_scale[task_id] + if args.gradient_accumulation_steps > 1: + loss = loss / args.gradient_accumulation_steps + + loss.backward() + if (step + 1) % args.gradient_accumulation_steps == 0: + optimizer.step() + model.zero_grad() + + if default_gpu: + tbLogger.step_train(epochId, iterId, float(loss), float(score), optimizer.show_lr(), task_id, 'train') + + if step % (20 * args.gradient_accumulation_steps) == 0 and step != 0 and default_gpu: + tbLogger.showLossTrain() + + model.eval() + # when run evaluate, we run each task sequentially. 
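The `mannul` branch above decays the learning rate by 10x at epochs 12 and 16 through `LambdaLR`. A minimal standalone sketch of that multiplier, using plain Python in place of the NumPy calls so it runs without dependencies (the behavior is the same):

```python
# Mirror of lr_lambda_fun above: the base LR is multiplied by 0.1 for every
# milestone epoch that has already been reached.
lr_reduce_list = [12, 16]

def lr_lambda_fun(epoch):
    # count milestones passed; each contributes one factor of 0.1
    return pow(0.1, sum(1 for m in lr_reduce_list if m <= epoch))

# Multiplier stays 1.0 until epoch 11, drops to 0.1 at epoch 12,
# and to 0.01 from epoch 16 onward.
factors = [lr_lambda_fun(e) for e in (0, 11, 12, 16, 19)]
```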
+ for task_id in task_ids: + for i, batch in enumerate(task_dataloader_val[task_id]): + loss, score, batch_size = ForwardModelsVal(args, task_cfg, device, task_id, batch, model, task_losses) + tbLogger.step_val(epochId, float(loss), float(score), task_id, batch_size, 'val') + if default_gpu: + sys.stdout.write('%d/%d\r' % (i, len(task_dataloader_val[task_id]))) + sys.stdout.flush() + + ave_score = tbLogger.showLossVal() + if args.lr_scheduler == 'automatic': + lr_scheduler.step(ave_score) + logger.info("best average score is %3f" %lr_scheduler.best) + else: + lr_scheduler.step() + + if default_gpu: + # Save a trained model + logger.info("** ** * Saving fine - tuned model on " + timeStamp + "** ** * ") + model_to_save = ( + model.module if hasattr(model, "module") else model + ) # Only save the model it-self + + if not os.path.exists(savePath): + os.makedirs(savePath) + output_model_file = os.path.join(savePath, "pytorch_model_" + str(epochId) + ".bin") + torch.save(model_to_save.state_dict(), output_model_file) + + tbLogger.txt_close() + +if __name__ == "__main__": + + main() \ No newline at end of file diff --git a/vilbert/__init__.py b/vilbert/__init__.py new file mode 100644 index 0000000..e69de29 diff --git a/vilbert/basebert.py b/vilbert/basebert.py new file mode 100644 index 0000000..f30b301 --- /dev/null +++ b/vilbert/basebert.py @@ -0,0 +1,901 @@ + +from __future__ import absolute_import, division, print_function, unicode_literals + +import copy +import json +import logging +import math +import os +import shutil +import tarfile +import tempfile +import sys +from io import open + +import torch +from torch import nn +import torch.nn.functional as F +from torch.nn import CrossEntropyLoss +from vilbert.utils import cached_path +from pytorch_pretrained_bert.modeling import BertConfig +import pdb +from torch.nn.utils.weight_norm import weight_norm + +# from .file_utils import cached_path +logger = logging.getLogger(__name__) + + +PRETRAINED_MODEL_ARCHIVE_MAP = { 
+ "bert-base-uncased": "https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased.tar.gz", + "bert-large-uncased": "https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-uncased.tar.gz", + "bert-base-cased": "https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-cased.tar.gz", + "bert-large-cased": "https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-cased.tar.gz", + "bert-base-multilingual-uncased": "https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-multilingual-uncased.tar.gz", + "bert-base-multilingual-cased": "https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-multilingual-cased.tar.gz", + "bert-base-chinese": "https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-chinese.tar.gz", +} +CONFIG_NAME = "bert_config.json" +WEIGHTS_NAME = "pytorch_model.bin" +TF_WEIGHTS_NAME = "model.ckpt" + +def gelu(x): + """Implementation of the gelu activation function. + For information: OpenAI GPT's gelu is slightly different (and gives slightly different results): + 0.5 * x * (1 + torch.tanh(math.sqrt(2 / math.pi) * (x + 0.044715 * torch.pow(x, 3)))) + Also see https://arxiv.org/abs/1606.08415 + """ + return x * 0.5 * (1.0 + torch.erf(x / math.sqrt(2.0))) + + +def swish(x): + return x * torch.sigmoid(x) + +ACT2FN = {"gelu": gelu, "relu": torch.nn.functional.relu, "swish": swish} + + +try: + from apex.normalization.fused_layer_norm import FusedLayerNorm as BertLayerNorm +except ImportError: + logger.info("Better speed can be achieved with apex installed from https://www.github.com/nvidia/apex .") + class BertLayerNorm(nn.Module): + def __init__(self, hidden_size, eps=1e-12): + """Construct a layernorm module in the TF style (epsilon inside the square root). 
+ """ + super(BertLayerNorm, self).__init__() + self.weight = nn.Parameter(torch.ones(hidden_size)) + self.bias = nn.Parameter(torch.zeros(hidden_size)) + self.variance_epsilon = eps + + def forward(self, x): + u = x.mean(-1, keepdim=True) + s = (x - u).pow(2).mean(-1, keepdim=True) + x = (x - u) / torch.sqrt(s + self.variance_epsilon) + return self.weight * x + self.bias + + +class BertPreTrainedModel(nn.Module): + """ An abstract class to handle weights initialization and + a simple interface for dowloading and loading pretrained models. + """ + + def __init__(self, config, *inputs, **kwargs): + super(BertPreTrainedModel, self).__init__() + + if not isinstance(config, BertConfig): + raise ValueError( + "Parameter config in `{}(config)` should be an instance of class `BertConfig`. " + "To create a model from a Google pretrained model use " + "`model = {}.from_pretrained(PRETRAINED_MODEL_NAME)`".format( + self.__class__.__name__, self.__class__.__name__ + ) + ) + + self.config = config + + def init_bert_weights(self, module): + """ Initialize the weights. + """ + if isinstance(module, (nn.Linear, nn.Embedding)): + # Slightly different from the TF version which uses truncated_normal for initialization + # cf https://github.com/pytorch/pytorch/pull/5617 + module.weight.data.normal_(mean=0.0, std=self.config.initializer_range) + elif isinstance(module, BertLayerNorm): + module.bias.data.zero_() + module.weight.data.fill_(1.0) + if isinstance(module, nn.Linear) and module.bias is not None: + module.bias.data.zero_() + + @classmethod + def from_pretrained( + cls, + pretrained_model_name_or_path, + config, + state_dict=None, + cache_dir=None, + from_tf=False, + *inputs, + **kwargs + ): + """ + Instantiate a BertPreTrainedModel from a pre-trained model file or a pytorch state dict. + Download and cache the pre-trained model file if needed. 
+ + Params: + pretrained_model_name_or_path: either: + - a str with the name of a pre-trained model to load selected in the list of: + . `bert-base-uncased` + . `bert-large-uncased` + . `bert-base-cased` + . `bert-large-cased` + . `bert-base-multilingual-uncased` + . `bert-base-multilingual-cased` + . `bert-base-chinese` + - a path or url to a pretrained model archive containing: + . `bert_config.json` a configuration file for the model + . `pytorch_model.bin` a PyTorch dump of a BertForPreTraining instance + - a path or url to a pretrained model archive containing: + . `bert_config.json` a configuration file for the model + . `model.chkpt` a TensorFlow checkpoint + from_tf: should we load the weights from a locally saved TensorFlow checkpoint + cache_dir: an optional path to a folder in which the pre-trained models will be cached. + state_dict: an optional state dictionnary (collections.OrderedDict object) to use instead of Google pre-trained models + *inputs, **kwargs: additional input for the specific Bert class + (ex: num_labels for BertForSequenceClassification) + """ + CONFIG_NAME = "bert_config.json" + WEIGHTS_NAME = "pytorch_model.bin" + TF_WEIGHTS_NAME = "model.ckpt" + + if pretrained_model_name_or_path in PRETRAINED_MODEL_ARCHIVE_MAP: + archive_file = PRETRAINED_MODEL_ARCHIVE_MAP[pretrained_model_name_or_path] + else: + archive_file = pretrained_model_name_or_path + # redirect to the cache, if necessary + try: + resolved_archive_file = cached_path(archive_file, cache_dir=cache_dir) + except EnvironmentError: + logger.error( + "Model name '{}' was not found in model name list ({}). 
" + "We assumed '{}' was a path or url but couldn't find any file " + "associated to this path or url.".format( + pretrained_model_name_or_path, + ", ".join(PRETRAINED_MODEL_ARCHIVE_MAP.keys()), + archive_file, + ) + ) + return None + + if resolved_archive_file == archive_file: + logger.info("loading archive file {}".format(archive_file)) + else: + logger.info( + "loading archive file {} from cache at {}".format( + archive_file, resolved_archive_file + ) + ) + tempdir = None + if os.path.isdir(resolved_archive_file) or from_tf: + serialization_dir = resolved_archive_file + elif resolved_archive_file[-3:] == 'bin': + serialization_dir = '/'.join(resolved_archive_file.split('/')[:-1]) + WEIGHTS_NAME = resolved_archive_file.split('/')[-1] + else: + # Extract archive to temp dir + tempdir = tempfile.mkdtemp() + logger.info( + "extracting archive file {} to temp dir {}".format( + resolved_archive_file, tempdir + ) + ) + with tarfile.open(resolved_archive_file, "r:gz") as archive: + archive.extractall(tempdir) + serialization_dir = tempdir + # Load config + # config_file = os.path.join(serialization_dir, CONFIG_NAME) + # config = BertConfig.from_json_file(config_file) + logger.info("Model config {}".format(config)) + # Instantiate model. 
+ model = cls(config, *inputs, **kwargs) + if state_dict is None and not from_tf: + weights_path = os.path.join(serialization_dir, WEIGHTS_NAME) + state_dict = torch.load( + weights_path, + map_location="cpu", + ) + if tempdir: + # Clean up temp dir + shutil.rmtree(tempdir) + if from_tf: + # Directly load from a TensorFlow checkpoint + weights_path = os.path.join(serialization_dir, TF_WEIGHTS_NAME) + return load_tf_weights_in_bert(model, weights_path) + # Load from a PyTorch state_dict + old_keys = [] + new_keys = [] + for key in state_dict.keys(): + new_key = None + if "gamma" in key: + new_key = key.replace("gamma", "weight") + if "beta" in key: + new_key = key.replace("beta", "bias") + if new_key: + old_keys.append(key) + new_keys.append(new_key) + for old_key, new_key in zip(old_keys, new_keys): + state_dict[new_key] = state_dict.pop(old_key) + + missing_keys = [] + unexpected_keys = [] + error_msgs = [] + # copy state_dict so _load_from_state_dict can modify it + metadata = getattr(state_dict, "_metadata", None) + state_dict = state_dict.copy() + if metadata is not None: + state_dict._metadata = metadata + + def load(module, prefix=""): + local_metadata = {} if metadata is None else metadata.get(prefix[:-1], {}) + module._load_from_state_dict( + state_dict, + prefix, + local_metadata, + True, + missing_keys, + unexpected_keys, + error_msgs, + ) + for name, child in module._modules.items(): + if child is not None: + load(child, prefix + name + ".") + + start_prefix = "" + if not hasattr(model, "bert") and any( + s.startswith("bert.") for s in state_dict.keys() + ): + start_prefix = "bert." 
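The `gamma`/`beta` renaming loop in `from_pretrained` above exists because checkpoints saved by older BERT code name the LayerNorm parameters `gamma` and `beta`, while the PyTorch modules here expose them as `weight` and `bias`. A standalone sketch of that migration on a plain dict (the helper name `migrate_layernorm_keys` is ours, not from the codebase):

```python
def migrate_layernorm_keys(state_dict):
    # Rename legacy LayerNorm keys in place, as from_pretrained does:
    # gamma -> weight, beta -> bias. All other keys are left untouched.
    old_keys, new_keys = [], []
    for key in state_dict.keys():
        new_key = None
        if "gamma" in key:
            new_key = key.replace("gamma", "weight")
        if "beta" in key:
            new_key = key.replace("beta", "bias")
        if new_key:
            old_keys.append(key)
            new_keys.append(new_key)
    for old_key, new_key in zip(old_keys, new_keys):
        state_dict[new_key] = state_dict.pop(old_key)
    return state_dict

sd = migrate_layernorm_keys({"encoder.LayerNorm.gamma": 1.0,
                             "encoder.LayerNorm.beta": 0.0,
                             "encoder.attention.query.weight": 2.0})
# keys are now: encoder.LayerNorm.weight, encoder.LayerNorm.bias,
#               encoder.attention.query.weight
```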
+ load(model, prefix=start_prefix) + if len(missing_keys) > 0: + logger.info( + "Weights of {} not initialized from pretrained model: {}".format( + model.__class__.__name__, missing_keys + ) + ) + if len(unexpected_keys) > 0: + logger.info( + "Weights from pretrained model not used in {}: {}".format( + model.__class__.__name__, unexpected_keys + ) + ) + if len(error_msgs) > 0: + raise RuntimeError( + "Error(s) in loading state_dict for {}:\n\t{}".format( + model.__class__.__name__, "\n\t".join(error_msgs) + ) + ) + return model + +class BertEmbeddings(nn.Module): + """Construct the embeddings from word, position and token_type embeddings. + """ + def __init__(self, config): + super(BertEmbeddings, self).__init__() + self.word_embeddings = nn.Embedding(config.vocab_size, config.hidden_size, padding_idx=0) + self.position_embeddings = nn.Embedding(config.max_position_embeddings, config.hidden_size, padding_idx=0) + self.token_type_embeddings = nn.Embedding(config.type_vocab_size, config.hidden_size, padding_idx=0) + + # self.LayerNorm is not snake-cased to stick with TensorFlow model variable name and be able to load + # any TensorFlow checkpoint file + self.LayerNorm = BertLayerNorm(config.hidden_size, eps=1e-12) + self.dropout = nn.Dropout(config.hidden_dropout_prob) + + def forward(self, input_ids, token_type_ids=None): + seq_length = input_ids.size(1) + position_ids = torch.arange(seq_length, dtype=torch.long, device=input_ids.device) + position_ids = position_ids.unsqueeze(0).expand_as(input_ids) + if token_type_ids is None: + token_type_ids = torch.zeros_like(input_ids) + + words_embeddings = self.word_embeddings(input_ids) + position_embeddings = self.position_embeddings(position_ids) + token_type_embeddings = self.token_type_embeddings(token_type_ids) + + embeddings = words_embeddings + position_embeddings + token_type_embeddings + embeddings = self.LayerNorm(embeddings) + embeddings = self.dropout(embeddings) + return embeddings + +class 
BertImageEmbeddings(nn.Module): + """Construct the embeddings from image, spatial location (omit now) and token_type embeddings. + """ + def __init__(self, config): + super(BertImageEmbeddings, self).__init__() + self.image_embeddings = nn.Linear(2048, config.hidden_size) + # self.position_embeddings = nn.Embedding(config.max_position_embeddings, config.hidden_size, padding_idx=0) + self.token_type_embeddings = nn.Embedding(config.type_vocab_size, config.hidden_size, padding_idx=0) + self.image_location_embeddings = nn.Linear(5, config.hidden_size) + + # self.LayerNorm is not snake-cased to stick with TensorFlow model variable name and be able to load + # any TensorFlow checkpoint file + self.LayerNorm = BertLayerNorm(config.hidden_size, eps=1e-12) + self.dropout = nn.Dropout(config.hidden_dropout_prob) + + def forward(self, input_ids, input_loc, token_type_ids=None): + seq_length = input_ids.size(1) + # position_ids = torch.arange(seq_length, dtype=torch.long, device=input_ids.device) + # position_ids = position_ids.unsqueeze(0).expand_as(input_ids) + if token_type_ids is None: + token_type_ids = torch.zeros_like(input_ids) + + image_embeddings = self.image_embeddings(input_ids) + # position_embeddings = self.position_embeddings(position_ids) + token_type_embeddings = self.token_type_embeddings(token_type_ids) + loc_embeddings = self.image_location_embeddings(input_loc) + + # embeddings = words_embeddings + position_embeddings + token_type_embeddings + embeddings = image_embeddings + token_type_embeddings + loc_embeddings + + embeddings = self.LayerNorm(embeddings) + embeddings = self.dropout(embeddings) + return embeddings + + +class BertSelfAttention(nn.Module): + def __init__(self, config): + super(BertSelfAttention, self).__init__() + if config.hidden_size % config.num_attention_heads != 0: + raise ValueError( + "The hidden size (%d) is not a multiple of the number of attention " + "heads (%d)" % (config.hidden_size, config.num_attention_heads)) + 
self.num_attention_heads = config.num_attention_heads + self.attention_head_size = int(config.hidden_size / config.num_attention_heads) + self.all_head_size = self.num_attention_heads * self.attention_head_size + + self.query = nn.Linear(config.hidden_size, self.all_head_size) + self.key = nn.Linear(config.hidden_size, self.all_head_size) + self.value = nn.Linear(config.hidden_size, self.all_head_size) + + self.dropout = nn.Dropout(config.attention_probs_dropout_prob) + + def transpose_for_scores(self, x): + new_x_shape = x.size()[:-1] + (self.num_attention_heads, self.attention_head_size) + x = x.view(*new_x_shape) + return x.permute(0, 2, 1, 3) + + def forward(self, hidden_states, attention_mask): + mixed_query_layer = self.query(hidden_states) + mixed_key_layer = self.key(hidden_states) + mixed_value_layer = self.value(hidden_states) + + query_layer = self.transpose_for_scores(mixed_query_layer) + key_layer = self.transpose_for_scores(mixed_key_layer) + value_layer = self.transpose_for_scores(mixed_value_layer) + + # Take the dot product between "query" and "key" to get the raw attention scores. + attention_scores = torch.matmul(query_layer, key_layer.transpose(-1, -2)) + attention_scores = attention_scores / math.sqrt(self.attention_head_size) + # Apply the attention mask is (precomputed for all layers in BertModel forward() function) + attention_scores = attention_scores + attention_mask + + # Normalize the attention scores to probabilities. + attention_probs = nn.Softmax(dim=-1)(attention_scores) + + # This is actually dropping out entire tokens to attend to, which might + # seem a bit unusual, but is taken from the original Transformer paper. 
+ attention_probs = self.dropout(attention_probs) + + context_layer = torch.matmul(attention_probs, value_layer) + context_layer = context_layer.permute(0, 2, 1, 3).contiguous() + new_context_layer_shape = context_layer.size()[:-2] + (self.all_head_size,) + context_layer = context_layer.view(*new_context_layer_shape) + return context_layer + + +class BertSelfOutput(nn.Module): + def __init__(self, config): + super(BertSelfOutput, self).__init__() + self.dense = nn.Linear(config.hidden_size, config.hidden_size) + self.LayerNorm = BertLayerNorm(config.hidden_size, eps=1e-12) + self.dropout = nn.Dropout(config.hidden_dropout_prob) + + def forward(self, hidden_states, input_tensor): + hidden_states = self.dense(hidden_states) + hidden_states = self.dropout(hidden_states) + hidden_states = self.LayerNorm(hidden_states + input_tensor) + return hidden_states + + +class BertAttention(nn.Module): + def __init__(self, config): + super(BertAttention, self).__init__() + self.self = BertSelfAttention(config) + self.output = BertSelfOutput(config) + + def forward(self, input_tensor, attention_mask): + self_output = self.self(input_tensor, attention_mask) + attention_output = self.output(self_output, input_tensor) + return attention_output + + +class BertIntermediate(nn.Module): + def __init__(self, config): + super(BertIntermediate, self).__init__() + self.dense = nn.Linear(config.hidden_size, config.intermediate_size) + if isinstance(config.hidden_act, str) or (sys.version_info[0] == 2 and isinstance(config.hidden_act, unicode)): + self.intermediate_act_fn = ACT2FN[config.hidden_act] + else: + self.intermediate_act_fn = config.hidden_act + + def forward(self, hidden_states): + hidden_states = self.dense(hidden_states) + hidden_states = self.intermediate_act_fn(hidden_states) + return hidden_states + + +class BertOutput(nn.Module): + def __init__(self, config): + super(BertOutput, self).__init__() + self.dense = nn.Linear(config.intermediate_size, config.hidden_size) + 
self.LayerNorm = BertLayerNorm(config.hidden_size, eps=1e-12) + self.dropout = nn.Dropout(config.hidden_dropout_prob) + + def forward(self, hidden_states, input_tensor): + hidden_states = self.dense(hidden_states) + hidden_states = self.dropout(hidden_states) + hidden_states = self.LayerNorm(hidden_states + input_tensor) + return hidden_states + + +class BertLayer(nn.Module): + def __init__(self, config): + super(BertLayer, self).__init__() + self.attention = BertAttention(config) + self.intermediate = BertIntermediate(config) + self.output = BertOutput(config) + + def forward(self, hidden_states, attention_mask): + attention_output = self.attention(hidden_states, attention_mask) + intermediate_output = self.intermediate(attention_output) + layer_output = self.output(intermediate_output, attention_output) + return layer_output + + +class BertEncoder(nn.Module): + def __init__(self, config): + super(BertEncoder, self).__init__() + layer = BertLayer(config) + self.layer = nn.ModuleList([copy.deepcopy(layer) for _ in range(config.num_hidden_layers)]) + + def forward(self, hidden_states, attention_mask, output_all_encoded_layers=True): + all_encoder_layers = [] + for layer_module in self.layer: + hidden_states = layer_module(hidden_states, attention_mask) + if output_all_encoded_layers: + all_encoder_layers.append(hidden_states) + if not output_all_encoded_layers: + all_encoder_layers.append(hidden_states) + return all_encoder_layers + + +class BertPooler(nn.Module): + def __init__(self, config): + super(BertPooler, self).__init__() + self.dense = nn.Linear(config.hidden_size, config.hidden_size) + self.activation = nn.Tanh() + + def forward(self, hidden_states): + # We "pool" the model by simply taking the hidden state corresponding + # to the first token. 
+ first_token_tensor = hidden_states[:, 0] + pooled_output = self.dense(first_token_tensor) + pooled_output = self.activation(pooled_output) + return pooled_output + + +class BertPredictionHeadTransform(nn.Module): + def __init__(self, config): + super(BertPredictionHeadTransform, self).__init__() + self.dense = nn.Linear(config.hidden_size, config.hidden_size) + if isinstance(config.hidden_act, str) or (sys.version_info[0] == 2 and isinstance(config.hidden_act, unicode)): + self.transform_act_fn = ACT2FN[config.hidden_act] + else: + self.transform_act_fn = config.hidden_act + self.LayerNorm = BertLayerNorm(config.hidden_size, eps=1e-12) + + def forward(self, hidden_states): + hidden_states = self.dense(hidden_states) + hidden_states = self.transform_act_fn(hidden_states) + hidden_states = self.LayerNorm(hidden_states) + return hidden_states + + +class BertLMPredictionHead(nn.Module): + def __init__(self, config, bert_model_embedding_weights): + super(BertLMPredictionHead, self).__init__() + self.transform = BertPredictionHeadTransform(config) + + # The output weights are the same as the input embeddings, but there is + # an output-only bias for each token. 
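The comment above describes weight tying: the LM decoder reuses the word-embedding matrix as its output projection, adding only a per-token bias. A NumPy sketch of the arithmetic, with toy sizes that are made up for illustration:

```python
import numpy as np

vocab_size, hidden_size = 6, 4  # toy sizes
embeddings = np.random.randn(vocab_size, hidden_size).astype(np.float32)  # word embedding matrix
bias = np.zeros(vocab_size, dtype=np.float32)  # output-only bias, one per token

hidden_states = np.random.randn(2, 3, hidden_size).astype(np.float32)
# Tied decoder: score each hidden state against every embedding row.
logits = hidden_states @ embeddings.T + bias
print(logits.shape)  # (2, 3, 6): one score per vocabulary entry
```

Tying the projection to the embeddings saves a `vocab_size x hidden_size` parameter matrix and is the convention inherited from the original BERT implementation.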
+        self.decoder = nn.Linear(bert_model_embedding_weights.size(1),
+                                 bert_model_embedding_weights.size(0),
+                                 bias=False)
+        self.decoder.weight = bert_model_embedding_weights
+        self.bias = nn.Parameter(torch.zeros(bert_model_embedding_weights.size(0)))
+
+    def forward(self, hidden_states):
+        hidden_states = self.transform(hidden_states)
+        hidden_states = self.decoder(hidden_states) + self.bias
+        return hidden_states
+
+
+class BertOnlyMLMHead(nn.Module):
+    def __init__(self, config, bert_model_embedding_weights):
+        super(BertOnlyMLMHead, self).__init__()
+        self.predictions = BertLMPredictionHead(config, bert_model_embedding_weights)
+
+    def forward(self, sequence_output):
+        prediction_scores = self.predictions(sequence_output)
+        return prediction_scores
+
+
+class BertOnlyNSPHead(nn.Module):
+    def __init__(self, config):
+        super(BertOnlyNSPHead, self).__init__()
+        self.seq_relationship = nn.Linear(config.hidden_size, 2)
+
+    def forward(self, pooled_output):
+        seq_relationship_score = self.seq_relationship(pooled_output)
+        return seq_relationship_score
+
+
+class BertImagePredictionHead(nn.Module):
+    def __init__(self, config, bert_model_embedding_weights):
+        super(BertImagePredictionHead, self).__init__()
+        self.transform = BertPredictionHeadTransform(config)
+
+        # Predict a distribution over the 1601 detector classes for each
+        # image region (no weight tying here, unlike the text head).
+        self.decoder = nn.Linear(bert_model_embedding_weights.size(1), 1601)
+
+    def forward(self, hidden_states):
+        hidden_states = self.transform(hidden_states)
+        hidden_states = self.decoder(hidden_states)
+        return hidden_states
+
+
+class BertPreTrainingHeads(nn.Module):
+    def __init__(self, config, bert_model_embedding_weights):
+        super(BertPreTrainingHeads, self).__init__()
+        self.predictions = BertLMPredictionHead(config, bert_model_embedding_weights)
+        self.seq_relationship = nn.Linear(config.hidden_size, 2)
+        self.imagePredictions = BertImagePredictionHead(config, bert_model_embedding_weights)
+
+    def forward(self, sequence_output_t, sequence_output_v, pooled_output):
+        img_prediction_scores = self.imagePredictions(sequence_output_v)
+        prediction_scores = self.predictions(sequence_output_t)
+        seq_relationship_score = self.seq_relationship(pooled_output)
+        return img_prediction_scores, prediction_scores, seq_relationship_score
+
+
+class BertModel(BertPreTrainedModel):
+    """BERT model ("Bidirectional Encoder Representations from Transformers").
+    Params:
+        config: a BertConfig class instance with the configuration to build a new model
+    Inputs:
+        `input_ids`: a torch.LongTensor of shape [batch_size, sequence_length]
+            with the word token indices in the vocabulary (see the tokens preprocessing logic in the scripts
+            `extract_features.py`, `run_classifier.py` and `run_squad.py`)
+        `token_type_ids`: an optional torch.LongTensor of shape [batch_size, sequence_length] with the token
+            type indices selected in [0, 1]. Type 0 corresponds to a `sentence A` and type 1 corresponds to
+            a `sentence B` token (see the BERT paper for more details).
+        `attention_mask`: an optional torch.LongTensor of shape [batch_size, sequence_length] with indices
+            selected in [0, 1]. It's a mask to be used if the input sequence length is smaller than the max
+            input sequence length in the current batch. It's the mask that we typically use for attention when
+            a batch has varying length sentences.
+        `output_all_encoded_layers`: boolean which controls the content of the `encoded_layers` output as described below. Default: `True`.
+    Outputs: Tuple of (encoded_layers, pooled_output)
+        `encoded_layers`: controlled by the `output_all_encoded_layers` argument:
+            - `output_all_encoded_layers=True`: outputs a list of the full sequences of encoded hidden states at the end
+                of each attention block (i.e. 12 full sequences for BERT-base, 24 for BERT-large), each
+                encoded hidden state being a torch.FloatTensor of size [batch_size, sequence_length, hidden_size],
+            - `output_all_encoded_layers=False`: outputs only the full sequence of hidden states corresponding
+                to the last attention block, of shape [batch_size, sequence_length, hidden_size],
+        `pooled_output`: a torch.FloatTensor of size [batch_size, hidden_size] which is the output of a
+            classifier pretrained on top of the hidden state associated with the first token of the
+            input (`CLS`) to train on the Next-Sentence task (see the BERT paper).
+ Example usage: + ```python + # Already been converted into WordPiece token ids + input_ids = torch.LongTensor([[31, 51, 99], [15, 5, 0]]) + input_mask = torch.LongTensor([[1, 1, 1], [1, 1, 0]]) + token_type_ids = torch.LongTensor([[0, 0, 1], [0, 1, 0]]) + config = modeling.BertConfig(vocab_size_or_config_json_file=32000, hidden_size=768, + num_hidden_layers=12, num_attention_heads=12, intermediate_size=3072) + model = modeling.BertModel(config=config) + all_encoder_layers, pooled_output = model(input_ids, token_type_ids, input_mask) + ``` + """ + def __init__(self, config): + super(BertModel, self).__init__(config) + self.image_embeddings = BertImageEmbeddings(config) + self.embeddings = BertEmbeddings(config) + self.encoder = BertEncoder(config) + self.pooler = BertPooler(config) + self.apply(self.init_bert_weights) + + def forward( + self, + input_txt, + input_imgs, + image_loc, + token_type_ids=None, + attention_mask=None, + image_attention_mask=None, + output_all_encoded_layers=True, + ): + + if attention_mask is None: + attention_mask = torch.ones_like(input_txt) + if token_type_ids is None: + token_type_ids = torch.zeros_like(input_txt) + if image_attention_mask is None: + image_attention_mask = torch.ones( + input_imgs.size(0), input_imgs.size(1) + ).type_as(input_txt) + + image_token_type_ids = torch.ones(input_imgs.size(0), input_imgs.size(1)).type_as(token_type_ids) + + # We create a 3D attention mask from a 2D tensor mask. + # Sizes are [batch_size, 1, 1, to_seq_length] + # So we can broadcast to [batch_size, num_heads, from_seq_length, to_seq_length] + # this attention mask is more simple than the triangular masking of causal attention + # used in OpenAI GPT, we just need to prepare the broadcast dimension here. 
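The broadcastable mask described in the comment above can be built and inspected in a few lines of NumPy (toy values; the real code does this on torch tensors, in the model's parameter dtype):

```python
import numpy as np

attention_mask = np.array([[1, 1, 1, 0]], dtype=np.float32)  # 1 = real token, 0 = padding
extended = attention_mask[:, None, None, :]                  # [batch, 1, 1, seq]
# 0.0 where we attend, -10000.0 where we mask; added to the raw
# attention scores before the softmax.
extended = (1.0 - extended) * -10000.0
print(extended.shape)  # (1, 1, 1, 4)
```

After broadcasting against `[batch, num_heads, from_seq, to_seq]` scores, the padded key position receives a score of roughly minus ten thousand, so its softmax weight is effectively zero.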
+        extended_attention_mask = attention_mask.unsqueeze(1).unsqueeze(2)
+        extended_image_attention_mask = image_attention_mask.unsqueeze(1).unsqueeze(2)
+
+        # Since attention_mask is 1.0 for positions we want to attend and 0.0 for
+        # masked positions, this operation will create a tensor which is 0.0 for
+        # positions we want to attend and -10000.0 for masked positions.
+        # Since we are adding it to the raw scores before the softmax, this is
+        # effectively the same as removing these entirely.
+        extended_attention_mask = extended_attention_mask.to(dtype=next(self.parameters()).dtype)  # fp16 compatibility
+        extended_attention_mask = (1.0 - extended_attention_mask) * -10000.0
+
+        extended_image_attention_mask = extended_image_attention_mask.to(
+            dtype=next(self.parameters()).dtype
+        )  # fp16 compatibility
+        extended_image_attention_mask = (1.0 - extended_image_attention_mask) * -10000.0
+
+        img_embedding_output = self.image_embeddings(input_imgs, image_loc, image_token_type_ids)
+        embedding_output = self.embeddings(input_txt, token_type_ids)
+
+        # Concatenate the text and image sequences (and their masks) and run
+        # the joint sequence through a single BERT encoder.
+        embedding_output = torch.cat([embedding_output, img_embedding_output], dim=1)
+        extended_attention_mask = torch.cat([extended_attention_mask, extended_image_attention_mask], dim=3)
+
+        encoded_layers = self.encoder(embedding_output,
+                                      extended_attention_mask,
+                                      output_all_encoded_layers=output_all_encoded_layers)
+
+        sequence_output = encoded_layers[-1]
+        pooled_output = self.pooler(sequence_output)
+        if not output_all_encoded_layers:
+            encoded_layers = encoded_layers[-1]
+        return encoded_layers, pooled_output
+
+
+class BertForMultiModalPreTraining(BertPreTrainedModel):
+    """BERT model with multi-modal pre-training heads.
+    This module comprises the BERT model followed by the two pre-training heads:
+    - the masked multi-modal modeling head, and
+    - the image-caption alignment classification head.
+
+    Params:
+        config: a BertConfig class instance with the configuration to build a new model.
+
+    Inputs:
+        `input_ids`: a torch.LongTensor of shape [batch_size, sequence_length]
+            with the word token indices in the vocabulary (see the tokens preprocessing logic in the scripts
+            `extract_features.py`, `run_classifier.py` and `run_squad.py`)
+        `token_type_ids`: an optional torch.LongTensor of shape [batch_size, sequence_length] with the token
+            type indices selected in [0, 1]. Type 0 corresponds to a `sentence A` and type 1 corresponds to
+            a `sentence B` token (see the BERT paper for more details).
+        `attention_mask`: an optional torch.LongTensor of shape [batch_size, sequence_length] with indices
+            selected in [0, 1]. It's a mask to be used if the input sequence length is smaller than the max
+            input sequence length in the current batch. It's the mask that we typically use for attention when
+            a batch has varying length sentences.
+        `masked_lm_labels`: optional masked language modeling labels: torch.LongTensor of shape [batch_size, sequence_length]
+            with indices selected in [-1, 0, ..., vocab_size]. All labels set to -1 are ignored (masked), the loss
+            is only computed for the labels set in [0, ..., vocab_size].
+        `next_sentence_label`: optional caption-image alignment label: torch.LongTensor of shape [batch_size]
+            with indices selected in [0, 1].
+            0 => the caption matches the image, 1 => the caption is a randomly sampled one.
+
+    Outputs:
+        if `masked_lm_labels` and `next_sentence_label` are not `None`:
+            Outputs the three pre-training losses: the masked language modeling loss, the masked
+            image region loss, and the alignment classification loss.
+        if `masked_lm_labels` or `next_sentence_label` is `None`:
+            Outputs a tuple comprising
+            - the masked language modeling logits of shape [batch_size, sequence_length, vocab_size],
+            - the alignment classification logits of shape [batch_size, 2], and
+            - the image region classification logits.
+
+    Example usage:
+    ```python
+    # Already been converted into WordPiece token ids
+    input_ids = torch.LongTensor([[31, 51, 99], [15, 5, 0]])
+    input_mask = torch.LongTensor([[1, 1, 1], [1, 1, 0]])
+    token_type_ids = torch.LongTensor([[0, 0, 1], [0, 1, 0]])
+
+    config = BertConfig(vocab_size_or_config_json_file=32000, hidden_size=768,
+        num_hidden_layers=12, num_attention_heads=12, intermediate_size=3072)
+
+    model = BertForMultiModalPreTraining(config)
+    # image_feat, image_target and image_loc are region features, targets and
+    # box locations from a pre-trained detector.
+    masked_lm_logits, seq_relationship_logits, img_logits = model(
+        input_ids, image_feat, image_target, image_loc, token_type_ids, input_mask)
+    ```
+    """
+    def __init__(self, config):
+        super(BertForMultiModalPreTraining, self).__init__(config)
+        self.bert = BertModel(config)
+        self.cls = BertPreTrainingHeads(config, self.bert.embeddings.word_embeddings.weight)
+        self.apply(self.init_bert_weights)
+
+        self.vis_criterion = nn.KLDivLoss(reduction="none")
+        self.loss_fct = CrossEntropyLoss(ignore_index=-1)
+
+    def forward(
+        self,
+        input_ids,
+        image_feat,
+        image_target,
+        image_loc,
+        token_type_ids=None,
+        attention_mask=None,
+        image_attention_mask=None,
+        masked_lm_labels=None,
+        image_label=None,
+        next_sentence_label=None,
+    ):
+        # In this model, we first embed the images.
+
+        sequence_output, pooled_output = self.bert(
+            input_ids,
+            image_feat,
+            image_loc,
+            token_type_ids,
+            attention_mask,
+            image_attention_mask,
+            output_all_encoded_layers=False,
+        )
+
+        # Split the joint sequence back into its text and image parts before
+        # feeding the prediction heads.
+        sequence_output_t = sequence_output[:, : input_ids.size(1)]
+        sequence_output_v = sequence_output[:, input_ids.size(1) :]
+
+        prediction_scores_v, prediction_scores_t, seq_relationship_score = self.cls(
+            sequence_output_t, sequence_output_v, pooled_output
+        )
+
+        if masked_lm_labels is not None and next_sentence_label is not None:
+
+            # Drop the prediction for the prepended global image feature.
+            prediction_scores_v = prediction_scores_v[:, 1:]
+
+            img_loss = self.vis_criterion(
+                F.log_softmax(prediction_scores_v, dim=2), image_target
+            )
+            # Average over masked regions only; clamp the count to avoid a
+            # division by zero when no region is masked.
+            masked_img_loss = torch.sum(
+                img_loss * (image_label == 1).unsqueeze(2).float()
+            ) / torch.sum(image_label == 1).float().clamp(min=1.0)
+
+            masked_lm_loss = self.loss_fct(
+                prediction_scores_t.view(-1, self.config.vocab_size),
+                masked_lm_labels.view(-1),
+            )
+
+            next_sentence_loss = self.loss_fct(
+                seq_relationship_score.view(-1, 2), next_sentence_label.view(-1)
+            )
+
+            return masked_lm_loss, masked_img_loss, next_sentence_loss
+        else:
+            return prediction_scores_t, seq_relationship_score, prediction_scores_v
+
+
+class BaseBertForVLTasks(BertPreTrainedModel):
+    def __init__(self, config, num_labels, dropout_prob=0.1, default_gpu=True):
+        super(BaseBertForVLTasks, self).__init__(config)
+        self.num_labels = num_labels
+        self.bert = BertModel(config)
+        self.dropout = nn.Dropout(dropout_prob)
+        self.cls = BertPreTrainingHeads(
+            config, self.bert.embeddings.word_embeddings.weight
+        )
+        self.vil_prediction = SimpleClassifier(config.hidden_size, config.hidden_size * 2, num_labels, 0.5)
+        # self.vil_prediction = nn.Linear(config.bi_hidden_size, num_labels)
+        self.vil_logit = nn.Linear(config.hidden_size, 1)
+        self.vision_logit = nn.Linear(config.hidden_size, 1)
+        self.linguisic_logit = nn.Linear(config.hidden_size, 1)
+        self.apply(self.init_bert_weights)
+
+    def forward(
+        self,
+        input_txt,
+        input_imgs,
+        image_loc,
+        token_type_ids=None,
+        attention_mask=None,
+        image_attention_mask=None,
+        co_attention_mask=None,
+
output_all_encoded_layers=False, + ): + sequence_output, pooled_output = self.bert( + input_txt, + input_imgs, + image_loc, + token_type_ids, + attention_mask, + image_attention_mask, + output_all_encoded_layers=output_all_encoded_layers, + ) + + sequence_output_v = sequence_output[:,input_txt.size(1):] + sequence_output_t = sequence_output[:,:input_txt.size(1)] + + vil_prediction = 0 + vil_logit = 0 + vil_binary_prediction = 0 + vision_prediction = 0 + vision_logit = 0 + linguisic_prediction = 0 + linguisic_logit = 0 + + vision_prediction, linguisic_prediction, vil_binary_prediction = self.cls(sequence_output_t, sequence_output_v, pooled_output) + + vil_prediction = self.vil_prediction(pooled_output) + vil_logit = self.vil_logit(pooled_output) + vision_logit = self.vision_logit(self.dropout(sequence_output_v)) + ((1.0 - image_attention_mask)* -10000.0).unsqueeze(2).to(dtype=next(self.parameters()).dtype) + linguisic_logit = self.linguisic_logit(self.dropout(sequence_output_t)) + + return vil_prediction, vil_logit, vil_binary_prediction, vision_prediction, vision_logit, linguisic_prediction, linguisic_logit + +class SimpleClassifier(nn.Module): + def __init__(self, in_dim, hid_dim, out_dim, dropout): + super(SimpleClassifier, self).__init__() + layers = [ + weight_norm(nn.Linear(in_dim, hid_dim), dim=None), + nn.ReLU(), + nn.Dropout(dropout, inplace=True), + weight_norm(nn.Linear(hid_dim, out_dim), dim=None) + ] + self.main = nn.Sequential(*layers) + + def forward(self, x): + logits = self.main(x) + return logits diff --git a/vilbert/datasets/__init__.py b/vilbert/datasets/__init__.py new file mode 100644 index 0000000..a06c537 --- /dev/null +++ b/vilbert/datasets/__init__.py @@ -0,0 +1,36 @@ +from .concept_cap_dataset import ConceptCapLoaderTrain, ConceptCapLoaderVal, ConceptCapLoaderRetrieval +from .vqa_dataset import VQAClassificationDataset +from .refer_expression_dataset import ReferExpressionDataset +from .retreival_dataset import RetreivalDataset, 
RetreivalDatasetVal
+from .vcr_dataset import VCRDataset
+
+
+# from .flickr_retreival_dataset import FlickrRetreivalDatasetTrain, FlickrRetreivalDatasetVal
+
+__all__ = [
+    "VQAClassificationDataset",
+    "ConceptCapLoaderTrain",
+    "ConceptCapLoaderVal",
+    "ReferExpressionDataset",
+    "RetreivalDataset",
+    "RetreivalDatasetVal",
+    "VCRDataset",
+    "ConceptCapLoaderRetrieval",
+]
+
+DatasetMapTrain = {
+    "TASK0": ConceptCapLoaderTrain,
+    "TASK1": VQAClassificationDataset,
+    "TASK2": VCRDataset,
+    "TASK3": VCRDataset,
+    "TASK4": RetreivalDataset,
+    "TASK5": ReferExpressionDataset,
+}
+
+DatasetMapEval = {
+    "TASK0": ConceptCapLoaderVal,
+    "TASK1": VQAClassificationDataset,
+    "TASK2": VCRDataset,
+    "TASK3": VCRDataset,
+    "TASK4": RetreivalDatasetVal,
+    "TASK5": ReferExpressionDataset,
+}
diff --git a/vilbert/datasets/_image_features_reader.py b/vilbert/datasets/_image_features_reader.py
new file mode 100644
index 0000000..8735f9d
--- /dev/null
+++ b/vilbert/datasets/_image_features_reader.py
@@ -0,0 +1,138 @@
+from typing import List
+import csv
+import h5py
+import numpy as np
+import copy
+import pickle
+import lmdb  # install lmdb by "pip install lmdb"
+import base64
+import pdb
+
+class ImageFeaturesH5Reader(object):
+    """
+    A reader for pre-extracted image features. Despite the class name, the
+    current implementation reads from an LMDB file; the original H5 layout it
+    was written for looked like:
+
+    ```
+    faster_rcnn_bottomup_features.h5
+       |--- "image_id" [shape: (num_images, )]
+       |--- "features" [shape: (num_images, num_proposals, feature_size)]
+       +--- .attrs ("split", "train")
+    ```
+    # TODO (kd): Add support to read boxes, classes and scores.
+
+    Parameters
+    ----------
+    features_path : str
+        Path to an LMDB file containing COCO train / val image features.
+    in_memory : bool
+        Whether to cache the features in memory. Beware, these files are
+        sometimes tens of GBs in size.
Set this to true if you have sufficient + RAM - trade-off between speed and memory. + """ + def __init__(self, features_path: str, in_memory: bool = False): + self.features_path = features_path + self._in_memory = in_memory + + # with h5py.File(self.features_h5path, "r", libver='latest', swmr=True) as features_h5: + # self._image_ids = list(features_h5["image_ids"]) + # If not loaded in memory, then list of None. + self.env = lmdb.open(self.features_path, max_readers=1, readonly=True, + lock=False, readahead=False, meminit=False) + + with self.env.begin(write=False) as txn: + self._image_ids = pickle.loads(txn.get('keys'.encode())) + + self.features = [None] * len(self._image_ids) + self.num_boxes = [None] * len(self._image_ids) + self.boxes = [None] * len(self._image_ids) + self.boxes_ori = [None] * len(self._image_ids) + + def __len__(self): + return len(self._image_ids) + + def __getitem__(self, image_id): + image_id = str(image_id).encode() + index = self._image_ids.index(image_id) + if self._in_memory: + # Load features during first epoch, all not loaded together as it + # has a slow start. 
+ if self.features[index] is not None: + features = self.features[index] + num_boxes = self.num_boxes[index] + image_location = self.boxes[index] + image_location_ori = self.boxes_ori[index] + else: + with self.env.begin(write=False) as txn: + item = pickle.loads(txn.get(image_id)) + image_id = item['image_id'] + image_h = int(item['image_h']) + image_w = int(item['image_w']) + num_boxes = int(item['num_boxes']) + + features = np.frombuffer(base64.b64decode(item["features"]), dtype=np.float32).reshape(num_boxes, 2048) + boxes = np.frombuffer(base64.b64decode(item['boxes']), dtype=np.float32).reshape(num_boxes, 4) + + g_feat = np.sum(features, axis=0) / num_boxes + num_boxes = num_boxes + 1 + + features = np.concatenate([np.expand_dims(g_feat, axis=0), features], axis=0) + self.features[index] = features + + image_location = np.zeros((boxes.shape[0], 5), dtype=np.float32) + image_location[:,:4] = boxes + image_location[:,4] = (image_location[:,3] - image_location[:,1]) * (image_location[:,2] - image_location[:,0]) / (float(image_w) * float(image_h)) + + image_location_ori = copy.deepcopy(image_location) + + image_location[:,0] = image_location[:,0] / float(image_w) + image_location[:,1] = image_location[:,1] / float(image_h) + image_location[:,2] = image_location[:,2] / float(image_w) + image_location[:,3] = image_location[:,3] / float(image_h) + + g_location = np.array([0,0,1,1,1]) + image_location = np.concatenate([np.expand_dims(g_location, axis=0), image_location], axis=0) + self.boxes[index] = image_location + + g_location_ori = np.array([0,0,image_w,image_h,image_w*image_h]) + image_location_ori = np.concatenate([np.expand_dims(g_location_ori, axis=0), image_location_ori], axis=0) + self.boxes_ori[index] = image_location_ori + self.num_boxes[index] = num_boxes + else: + # Read chunk from file everytime if not loaded in memory. 
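The 5-d region location built above encodes the normalized box corners plus the box's fraction of the image area. A small NumPy example with made-up pixel values shows the encoding:

```python
import numpy as np

image_w, image_h = 640, 480
boxes = np.array([[32.0, 48.0, 320.0, 240.0]], dtype=np.float32)  # x1, y1, x2, y2 in pixels

image_location = np.zeros((boxes.shape[0], 5), dtype=np.float32)
image_location[:, :4] = boxes
# Fifth feature: box area as a fraction of the image area.
image_location[:, 4] = (
    (image_location[:, 3] - image_location[:, 1])
    * (image_location[:, 2] - image_location[:, 0])
    / (float(image_w) * float(image_h))
)
# Normalize the corners to [0, 1].
image_location[:, [0, 2]] /= float(image_w)
image_location[:, [1, 3]] /= float(image_h)
# image_location[0] is now [x1/w, y1/h, x2/w, y2/h, area fraction]
#                        = [0.05, 0.1, 0.5, 0.5, 0.18]
```

The reader also prepends a "global" row `[0, 0, 1, 1, 1]` covering the whole image, matching the mean-pooled feature prepended to the region features.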
+ with self.env.begin(write=False) as txn: + item = pickle.loads(txn.get(image_id)) + image_id = item['image_id'] + image_h = int(item['image_h']) + image_w = int(item['image_w']) + num_boxes = int(item['num_boxes']) + + features = np.frombuffer(base64.b64decode(item["features"]), dtype=np.float32).reshape(num_boxes, 2048) + boxes = np.frombuffer(base64.b64decode(item['boxes']), dtype=np.float32).reshape(num_boxes, 4) + g_feat = np.sum(features, axis=0) / num_boxes + num_boxes = num_boxes + 1 + features = np.concatenate([np.expand_dims(g_feat, axis=0), features], axis=0) + + image_location = np.zeros((boxes.shape[0], 5), dtype=np.float32) + image_location[:,:4] = boxes + image_location[:,4] = (image_location[:,3] - image_location[:,1]) * (image_location[:,2] - image_location[:,0]) / (float(image_w) * float(image_h)) + + image_location_ori = copy.deepcopy(image_location) + image_location[:,0] = image_location[:,0] / float(image_w) + image_location[:,1] = image_location[:,1] / float(image_h) + image_location[:,2] = image_location[:,2] / float(image_w) + image_location[:,3] = image_location[:,3] / float(image_h) + + g_location = np.array([0,0,1,1,1]) + image_location = np.concatenate([np.expand_dims(g_location, axis=0), image_location], axis=0) + + g_location_ori = np.array([0,0,image_w,image_h,image_w*image_h]) + image_location_ori = np.concatenate([np.expand_dims(g_location_ori, axis=0), image_location_ori], axis=0) + + return features, num_boxes, image_location, image_location_ori + + def keys(self) -> List[int]: + return self._image_ids + diff --git a/vilbert/datasets/concept_cap_dataset.py b/vilbert/datasets/concept_cap_dataset.py new file mode 100644 index 0000000..c332208 --- /dev/null +++ b/vilbert/datasets/concept_cap_dataset.py @@ -0,0 +1,872 @@ +import copy +import json +import logging +import os +import random + +import lmdb +import numpy as np +import tensorpack.dataflow as td + +import torch +from torch.utils.data import Dataset +from 
torch.utils.data.sampler import Sampler
+import torch.distributed as dist
+import sys
+import pdb
+
+logging.basicConfig(
+    format="%(asctime)s - %(levelname)s - %(name)s - %(message)s",
+    datefmt="%m/%d/%Y %H:%M:%S",
+    level=logging.INFO,
+)
+logger = logging.getLogger(__name__)
+
+
+class InputExample(object):
+    """A single training/test example for the language model."""
+
+    def __init__(
+        self, image_feat=None, image_target=None, caption=None, is_next=None, lm_labels=None, image_loc=None, num_boxes=None
+    ):
+        """Constructs an InputExample.
+        Args:
+            image_feat: region features for the image.
+            image_target: target distribution over detector classes per region.
+            caption: string. The untokenized caption paired with the image.
+            is_next: caption-image alignment label (0 = aligned, 1 = mismatched).
+            lm_labels: masked words for the language model.
+            image_loc: box locations for each region.
+            num_boxes: number of region boxes in the image.
+        """
+        self.image_feat = image_feat
+        self.caption = caption
+        self.is_next = is_next  # alignment label
+        self.lm_labels = lm_labels  # masked words for language model
+        self.image_loc = image_loc
+        self.image_target = image_target
+        self.num_boxes = num_boxes
+
+
+class InputFeatures(object):
+    """A single set of features of data."""
+
+    def __init__(
+        self,
+        input_ids=None,
+        input_mask=None,
+        segment_ids=None,
+        is_next=None,
+        lm_label_ids=None,
+        image_feat=None,
+        image_target=None,
+        image_loc=None,
+        image_label=None,
+        image_mask=None
+    ):
+        self.input_ids = input_ids
+        self.input_mask = input_mask
+        self.segment_ids = segment_ids
+        self.is_next = is_next
+        self.lm_label_ids = lm_label_ids
+        self.image_feat = image_feat
+        self.image_loc = image_loc
+        self.image_label = image_label
+        self.image_target = image_target
+        self.image_mask = image_mask
+
+
+class ConceptCapLoaderTrain(object):
+    """
+    Data loader.
Combines a dataset and a sampler, and provides + single- or multi-process iterators over the dataset. + Arguments: + mode (str, required): mode of dataset to operate in, one of ['train', 'val'] + batch_size (int, optional): how many samples per batch to load + (default: 1). + shuffle (bool, optional): set to ``True`` to have the data reshuffled + at every epoch (default: False). + num_workers (int, optional): how many subprocesses to use for data + loading. 0 means that the data will be loaded in the main process + (default: 0) + cache (int, optional): cache size to use when loading data, + drop_last (bool, optional): set to ``True`` to drop the last incomplete batch, + if the dataset size is not divisible by the batch size. If ``False`` and + the size of dataset is not divisible by the batch size, then the last batch + will be smaller. (default: False) + cuda (bool, optional): set to ``True`` and the PyTorch tensors will get preloaded + to the GPU for you (necessary because this lets us to uint8 conversion on the + GPU, which is faster). 
+ """ + + def __init__( + self, + corpus_path, + tokenizer, + seq_len, + encoding="utf-8", + predict_feature=False, + hard_negative=False, + batch_size=512, + shuffle=False, + num_workers=25, + cache=50000, + drop_last=False, + cuda=False, + distributed=False, + visualization=False, + ): + + if dist.is_available() and distributed: + num_replicas = dist.get_world_size() + # assert num_replicas == 8 + rank = dist.get_rank() + lmdb_file = "/coc/dataset/conceptual_caption/training_feat_part_" + str(rank) + ".lmdb" + # if not os.path.exists(lmdb_file): + # lmdb_file = "/srv/share/datasets/conceptual_caption/training_feat_part_" + str(rank) + ".lmdb" + else: + # lmdb_file = "/coc/dataset/conceptual_caption/training_feat_all.lmdb" + # if not os.path.exists(lmdb_file): + lmdb_file = "/coc/pskynet2/jlu347/multi-modal-bert/data/conceptual_caption/training_feat_all.lmdb" + + caption_path = "/coc/pskynet2/jlu347/multi-modal-bert/data/conceptual_caption/caption_train.json" + print("Loading from %s" % lmdb_file) + + ds = td.LMDBSerializer.load(lmdb_file, shuffle=False) + self.num_dataset = len(ds) + + preprocess_function = BertPreprocessBatch( + caption_path, + tokenizer, + seq_len, + 36, + self.num_dataset, + encoding="utf-8", + predict_feature=predict_feature, + ) + + ds = td.LocallyShuffleData(ds, cache) + ds = td.PrefetchData(ds, 5000, 1) + ds = td.MapData(ds, preprocess_function) + # self.ds = td.PrefetchData(ds, 1) + ds = td.PrefetchDataZMQ(ds, num_workers) + self.ds = td.BatchData(ds, batch_size) + # self.ds = ds + self.ds.reset_state() + + self.batch_size = batch_size + self.num_workers = num_workers + + def __iter__(self): + + for batch in self.ds.get_data(): + input_ids, input_mask, segment_ids, lm_label_ids, is_next, image_feat, \ + image_loc, image_target, image_label, image_mask, image_id = batch + + batch_size = input_ids.shape[0] + g_image_feat = np.sum(image_feat, axis=1) / np.sum(image_mask, axis=1, keepdims=True) + image_feat = 
np.concatenate([np.expand_dims(g_image_feat, axis=1), image_feat], axis=1) + image_feat = np.array(image_feat, dtype=np.float32) + + g_image_loc = np.repeat(np.array([[0,0,1,1,1]], dtype=np.float32), batch_size, axis=0) + image_loc = np.concatenate([np.expand_dims(g_image_loc, axis=1), image_loc], axis=1) + + image_loc = np.array(image_loc, dtype=np.float32) + g_image_mask = np.repeat(np.array([[1]]), batch_size, axis=0) + image_mask = np.concatenate([g_image_mask, image_mask], axis=1) + + batch = (input_ids, input_mask, segment_ids, lm_label_ids, is_next, image_feat, \ + image_loc, image_target, image_label, image_mask, image_id) + + yield tuple(torch.tensor(data) for data in batch) + + def __len__(self): + return self.ds.size() + +class ConceptCapLoaderVal(object): + """ + Data loader. Combines a dataset and a sampler, and provides + single- or multi-process iterators over the dataset. + Arguments: + mode (str, required): mode of dataset to operate in, one of ['train', 'val'] + batch_size (int, optional): how many samples per batch to load + (default: 1). + shuffle (bool, optional): set to ``True`` to have the data reshuffled + at every epoch (default: False). + num_workers (int, optional): how many subprocesses to use for data + loading. 0 means that the data will be loaded in the main process + (default: 0) + cache (int, optional): cache size to use when loading data, + drop_last (bool, optional): set to ``True`` to drop the last incomplete batch, + if the dataset size is not divisible by the batch size. If ``False`` and + the size of dataset is not divisible by the batch size, then the last batch + will be smaller. (default: False) + cuda (bool, optional): set to ``True`` and the PyTorch tensors will get preloaded + to the GPU for you (necessary because this lets us to uint8 conversion on the + GPU, which is faster). 
+ """ + + def __init__( + self, + corpus_path, + tokenizer, + seq_len, + encoding="utf-8", + predict_feature=False, + batch_size=512, + shuffle=False, + num_workers=25, + cache=50000, + drop_last=False, + cuda=False, + distributed=False, + visualization=False, + ): + + lmdb_file = "/coc/dataset/conceptual_caption/validation_feat_all.lmdb" + if not os.path.exists(lmdb_file): + lmdb_file = "/coc/pskynet2/jlu347/multi-modal-bert/data/conceptual_caption/validation_feat_all.lmdb" + caption_path = "/coc/pskynet2/jlu347/multi-modal-bert/data/conceptual_caption/caption_val.json" + + print("Loading from %s" % lmdb_file) + + ds = td.LMDBSerializer.load(lmdb_file, shuffle=False) + self.num_dataset = len(ds) + preprocess_function = BertPreprocessBatch( + caption_path, + tokenizer, + seq_len, + 36, + self.num_dataset, + encoding="utf-8", + predict_feature=predict_feature, + visualization=visualization, + ) + + ds = td.MapData(ds, preprocess_function) + self.ds = td.BatchData(ds, batch_size) + self.ds.reset_state() + + self.batch_size = batch_size + self.num_workers = num_workers + + def __iter__(self): + for batch in self.ds.get_data(): + input_ids, input_mask, segment_ids, lm_label_ids, is_next, image_feat, \ + image_loc, image_target, image_label, image_mask, image_id = batch + + batch_size = input_ids.shape[0] + g_image_feat = np.sum(image_feat, axis=1) / np.sum(image_mask, axis=1, keepdims=True) + image_feat = np.concatenate([np.expand_dims(g_image_feat, axis=1), image_feat], axis=1) + image_feat = np.array(image_feat, dtype=np.float32) + + g_image_loc = np.repeat(np.array([[0,0,1,1,1]], dtype=np.float32), batch_size, axis=0) + image_loc = np.concatenate([np.expand_dims(g_image_loc, axis=1), image_loc], axis=1) + + image_loc = np.array(image_loc, dtype=np.float32) + g_image_mask = np.repeat(np.array([[1]]), batch_size, axis=0) + image_mask = np.concatenate([g_image_mask, image_mask], axis=1) + + # batch = (input_ids, input_mask, segment_ids, lm_label_ids, is_next, 
image_feat, \ + # image_loc, image_target, image_label, image_mask, image_id) + batch = (input_ids, input_mask, segment_ids, lm_label_ids, is_next, image_feat, \ + image_loc, image_target, image_label, image_mask) + + + yield tuple([torch.tensor(data) for data in batch] + [image_id]) + + def __len__(self): + return self.ds.size() + + +class BertPreprocessBatch(object): + def __init__( + self, + caption_path, + tokenizer, + seq_len, + region_len, + data_size, + split="Train", + encoding="utf-8", + predict_feature=False, + visualization=False + ): + + self.split = split + self.seq_len = seq_len + self.region_len = region_len + self.tokenizer = tokenizer + self.predict_feature = predict_feature + self.num_caps = data_size + self.captions = list(json.load(open(caption_path, 'r')).values()) + self.visualization = visualization + + def __call__(self, data): + + image_feature_wp, image_target_wp, image_location_wp, num_boxes, image_h, image_w, image_id, caption = data + + image_feature = np.zeros((self.region_len, 2048), dtype=np.float32) + image_target = np.zeros((self.region_len, 1601), dtype=np.float32) + image_location = np.zeros((self.region_len, 5), dtype=np.float32) + + num_boxes = int(num_boxes) + image_feature[:num_boxes] = image_feature_wp + image_target[:num_boxes] = image_target_wp + image_location[:num_boxes,:4] = image_location_wp + + image_location[:,4] = (image_location[:,3] - image_location[:,1]) * (image_location[:,2] - image_location[:,0]) / (float(image_w) * float(image_h)) + + image_location[:,0] = image_location[:,0] / float(image_w) + image_location[:,1] = image_location[:,1] / float(image_h) + image_location[:,2] = image_location[:,2] / float(image_w) + image_location[:,3] = image_location[:,3] / float(image_h) + + if self.predict_feature: + image_feature = copy.deepcopy(image_feature) + image_target = copy.deepcopy(image_feature) + else: + image_feature = copy.deepcopy(image_feature) + image_target = copy.deepcopy(image_target) + + caption, label 
= self.random_cap(caption)
+
+        tokens_caption = self.tokenizer.tokenize(caption)
+        cur_example = InputExample(
+            image_feat=image_feature,
+            image_target=image_target,
+            caption=tokens_caption,
+            is_next=label,
+            image_loc=image_location,
+            num_boxes=num_boxes
+        )
+
+        # transform sample to features
+        cur_features = self.convert_example_to_features(cur_example, self.seq_len, self.tokenizer, self.region_len)
+
+        cur_tensors = (
+            cur_features.input_ids,
+            cur_features.input_mask,
+            cur_features.segment_ids,
+            cur_features.lm_label_ids,
+            cur_features.is_next,
+            cur_features.image_feat,
+            cur_features.image_loc,
+            cur_features.image_target,
+            cur_features.image_label,
+            cur_features.image_mask,
+            image_id,
+        )
+        return cur_tensors
+
+    def random_cap(self, caption):
+        """
+        With probability 50%, keep the caption that matches the image (label 0);
+        otherwise replace it with a random caption from another image (label 1).
+        Used for the caption-image alignment prediction task.
+        :param caption: str, ground-truth caption for the current image.
+        :return: (str, int), the (possibly replaced) caption and its alignment label.
+        """
+
+        if self.visualization:
+            return caption, 0
+
+        if random.random() > 0.5:
+            label = 0
+        else:
+            caption = self.get_random_caption()
+            label = 1
+
+        return caption, label
+
+    def get_random_caption(self):
+        """
+        Get a random caption from the corpus for the alignment task.
+        :return: str, a randomly sampled caption.
+        """
+        # Similar to the original tf repo: this outer loop should rarely go for more than
+        # one iteration for large corpora. However, just to be careful, we try to make sure
+        # that the random caption is not from the image we're processing.
+
+        # add the hard negative mining objective here. 
+        rand_doc_idx = random.randint(0, self.num_caps - 1)
+        caption = self.captions[rand_doc_idx]
+
+        return caption
+
+    def convert_example_to_features(self, example, max_seq_length, tokenizer, max_region_length):
+        """
+        Convert a raw sample (pair of sentences as tokenized strings) into a proper training sample with
+        IDs, LM labels, input_mask, CLS and SEP tokens etc.
+        :param example: InputExample, containing sentence input as strings and is_next label
+        :param max_seq_length: int, maximum length of sequence.
+        :param tokenizer: Tokenizer
+        :return: InputFeatures, containing all inputs and labels of one sample as IDs (as used for model training)
+        """
+        image_feat = example.image_feat
+        caption = example.caption
+        image_loc = example.image_loc
+        image_target = example.image_target
+        num_boxes = int(example.num_boxes)
+        self._truncate_seq_pair(caption, max_seq_length - 2)
+        caption, caption_label = self.random_word(caption, tokenizer)
+
+        image_feat, image_loc, image_label = self.random_region(image_feat, image_loc, num_boxes)
+
+        # concatenate lm labels and account for [CLS] and [SEP]
+        lm_label_ids = [-1] + caption_label + [-1]
+
+        # The convention in BERT is:
+        # (a) For sequence pairs:
+        #  tokens:   [CLS] is this jack ##son ##ville ? [SEP] no it is not . [SEP]
+        #  type_ids: 0   0  0    0    0     0       0 0    1  1  1  1   1 1
+        # (b) For single sequences:
+        #  tokens:   [CLS] the dog is hairy . [SEP]
+        #  type_ids: 0   0   0   0  0     0 0
+        #
+        # Where "type_ids" are used to indicate whether this is the first
+        # sequence or the second sequence. The embedding vectors for `type=0` and
+        # `type=1` were learned during pre-training and are added to the wordpiece
+        # embedding vector (and position vector). This is not *strictly* necessary
+        # since the [SEP] token unambiguously separates the sequences, but it makes
+        # it easier for the model to learn the concept of sequences. 
+        #
+        # For classification tasks, the first vector (corresponding to [CLS]) is
+        # used as the "sentence vector". Note that this only makes sense because
+        # the entire model is fine-tuned.
+        tokens = []
+        segment_ids = []
+
+        tokens.append("[CLS]")
+        segment_ids.append(0)
+        for token in caption:
+            tokens.append(token)
+            segment_ids.append(0)
+        tokens.append("[SEP]")
+        segment_ids.append(0)
+
+        input_ids = tokenizer.convert_tokens_to_ids(tokens)
+
+        # The mask has 1 for real tokens and 0 for padding tokens. Only real
+        # tokens are attended to.
+        input_mask = [1] * (len(input_ids))
+        image_mask = [1] * (num_boxes)
+        # Zero-pad up to the visual sequence length.
+        while len(image_mask) < max_region_length:
+            image_mask.append(0)
+            image_label.append(-1)
+
+        # Zero-pad up to the sequence length.
+        while len(input_ids) < max_seq_length:
+            input_ids.append(0)
+            input_mask.append(0)
+            segment_ids.append(0)
+            lm_label_ids.append(-1)
+
+        assert len(input_ids) == max_seq_length
+        assert len(input_mask) == max_seq_length
+        assert len(segment_ids) == max_seq_length
+        assert len(lm_label_ids) == max_seq_length
+        assert len(image_mask) == max_region_length
+        assert len(image_label) == max_region_length
+
+        features = InputFeatures(
+            input_ids=np.array(input_ids),
+            input_mask=np.array(input_mask),
+            segment_ids=np.array(segment_ids),
+            lm_label_ids=np.array(lm_label_ids),
+            is_next=np.array(example.is_next),
+            image_feat=image_feat,
+            image_target=image_target,
+            image_loc=image_loc,
+            image_label=np.array(image_label),
+            image_mask=np.array(image_mask)
+        )
+        return features
+
+    def _truncate_seq_pair(self, tokens_b, max_length):
+        """Truncates a sequence in place to the maximum length."""
+
+        # This is a simple heuristic which will always truncate the longer sequence
+        # one token at a time. This makes more sense than truncating an equal percent
+        # of tokens from each, since if one sequence is very short then each token
+        # that's truncated likely contains more information than a longer sequence.
+        while True:
+            total_length = len(tokens_b)
+            if total_length <= max_length:
+                break
+
+            tokens_b.pop()
+
+    def random_word(self, tokens, tokenizer):
+        """
+        Masking some random tokens for the Language Model task, with probabilities as in the original BERT paper.
+        :param tokens: list of str, tokenized sentence.
+        :param tokenizer: Tokenizer, object used for tokenization (we need its vocab here)
+        :return: (list of str, list of int), masked tokens and related labels for LM prediction
+        """
+        output_label = []
+
+        for i, token in enumerate(tokens):
+            prob = random.random()
+            # mask token with 15% probability
+            if prob < 0.15 and not self.visualization:
+                prob /= 0.15
+
+                # 80% randomly change token to mask token
+                if prob < 0.8:
+                    tokens[i] = "[MASK]"
+
+                # 10% randomly change token to random token
+                elif prob < 0.9:
+                    tokens[i] = random.choice(list(tokenizer.vocab.items()))[0]
+
+                # -> rest 10% randomly keep current token
+
+                # append current token to output (we will predict these later)
+                try:
+                    output_label.append(tokenizer.vocab[token])
+                except KeyError:
+                    # For unknown words (should not occur with BPE vocab)
+                    output_label.append(tokenizer.vocab["[UNK]"])
+                    logger.warning(
+                        "Cannot find token '{}' in vocab. 
Using [UNK] instead".format(token)
+                    )
+            else:
+                # no masking; token will be ignored by the loss function later
+                output_label.append(-1)
+
+        return tokens, output_label
+
+    def random_region(self, image_feat, image_loc, num_boxes):
+        """
+        Mask image regions with 15% probability, analogous to random_word for text tokens.
+        """
+        output_label = []
+
+        for i in range(num_boxes):
+            prob = random.random()
+            # mask region with 15% probability
+            if prob < 0.15 and not self.visualization:
+                prob /= 0.15
+
+                # 90% of the time, zero out the region feature; unlike text tokens,
+                # masked regions are never replaced with a random region
+                if prob < 0.9:
+                    image_feat[i] = 0
+
+                # append current region to output (we will predict these later)
+                output_label.append(1)
+            else:
+                # no masking; region will be ignored by the loss function later
+                output_label.append(-1)
+
+        return image_feat, image_loc, output_label
+
+
+class ConceptCapLoaderRetrieval(object):
+    """
+    Data loader. Combines a dataset and a sampler, and provides
+    single- or multi-process iterators over the dataset.
+    Arguments:
+        batch_size (int, optional): how many samples per batch to load
+            (default: 1).
+        shuffle (bool, optional): set to ``True`` to have the data reshuffled
+            at every epoch (default: False).
+        num_workers (int, optional): how many subprocesses to use for data
+            loading. 0 means that the data will be loaded in the main process
+            (default: 0)
+        cache (int, optional): cache size to use when loading data,
+        drop_last (bool, optional): set to ``True`` to drop the last incomplete batch,
+            if the dataset size is not divisible by the batch size. If ``False`` and
+            the size of dataset is not divisible by the batch size, then the last batch
+            will be smaller. 
(default: False)
+        cuda (bool, optional): set to ``True`` and the PyTorch tensors will get preloaded
+            to the GPU for you (necessary because this lets us do uint8 conversion on the
+            GPU, which is faster).
+    """
+
+    def __init__(
+        self,
+        corpus_path,
+        tokenizer,
+        seq_len,
+        encoding="utf-8",
+        predict_feature=False,
+        batch_size=512,
+        shuffle=False,
+        num_workers=10,
+        cache=50000,
+        drop_last=False,
+        cuda=False,
+    ):
+
+        lmdb_file = "/coc/dataset/conceptual_caption/validation_feat_all.lmdb"
+        if not os.path.exists(lmdb_file):
+            lmdb_file = "/coc/pskynet2/jlu347/multi-modal-bert/data/conceptual_caption/validation_feat_all.lmdb"
+        caption_path = "/coc/pskynet2/jlu347/multi-modal-bert/data/conceptual_caption/caption_val.json"
+
+        print("Loading from %s" % lmdb_file)
+
+        ds = td.LMDBSerializer.load(lmdb_file, shuffle=False)
+        self.num_dataset = len(ds)
+        preprocess_function = BertPreprocessRetrieval(
+            caption_path,
+            tokenizer,
+            seq_len,
+            36,
+            1000,
+            encoding="utf-8",
+            predict_feature=predict_feature,
+        )
+
+        ds = td.MapData(ds, preprocess_function)
+        self.ds = td.BatchData(ds, 1)
+        self.ds.reset_state()
+
+        self.batch_size = 1
+        self.num_workers = num_workers
+        self._entry = []
+
+        self.features_all = np.zeros((1000, 37, 2048), dtype=np.float32)
+        self.spatials_all = np.zeros((1000, 37, 5), dtype=np.float32)
+        self.image_mask_all = np.zeros((1000, 37), dtype=np.float32)
+        self.image_ids = []
+        # load the first 1000 samples here. 
+        for i, batch in enumerate(self.ds.get_data()):
+            if i >= 1000:
+                break
+            input_ids, input_mask, segment_ids, is_next, image_feat, \
+            image_loc, image_mask, image_id, caption = batch
+
+            batch_size = input_ids.shape[0]
+            g_image_feat = np.sum(image_feat, axis=1) / np.sum(image_mask, axis=1, keepdims=True)
+            image_feat = np.concatenate([np.expand_dims(g_image_feat, axis=1), image_feat], axis=1)
+            image_feat = np.array(image_feat, dtype=np.float32)
+
+            g_image_loc = np.repeat(np.array([[0,0,1,1,1]], dtype=np.float32), batch_size, axis=0)
+            image_loc = np.concatenate([np.expand_dims(g_image_loc, axis=1), image_loc], axis=1)
+
+            image_loc = np.array(image_loc, dtype=np.float32)
+            g_image_mask = np.repeat(np.array([[1]]), batch_size, axis=0)
+            image_mask = np.concatenate([g_image_mask, image_mask], axis=1)
+
+            batch = (input_ids, input_mask, segment_ids, image_id, caption)
+            self._entry.append(batch)
+
+            self.features_all[i] = image_feat
+            self.image_mask_all[i] = np.array(image_mask)
+            self.spatials_all[i] = image_loc
+            self.image_ids.append(image_id)
+            sys.stdout.write('%d/%d\r' % (i, 1000))
+            sys.stdout.flush()
+
+
+    def __iter__(self):
+
+        for index in range(self.__len__()):
+            caption_idx = int(index / 2)
+            image_idx = index % 2
+
+            if image_idx == 0:
+                image_entries = self.image_ids[:500]
+                features_all = self.features_all[:500]
+                spatials_all = self.spatials_all[:500]
+                image_mask_all = self.image_mask_all[:500]
+
+            else:
+                image_entries = self.image_ids[500:]
+                features_all = self.features_all[500:]
+                spatials_all = self.spatials_all[500:]
+                image_mask_all = self.image_mask_all[500:]
+
+            # unpack the token ids under their own name; reusing `caption` for both
+            # fields would clobber the token ids with the raw caption string
+            input_ids, input_mask, segment_ids, txt_image_id, caption = self._entry[caption_idx]
+            target_all = np.zeros((500))
+            for i, image_id in enumerate(image_entries):
+                if image_id == txt_image_id:
+                    target_all[i] = 1
+
+            batch = (features_all, spatials_all, image_mask_all, input_ids, input_mask, segment_ids, target_all, caption_idx, image_idx)
+            batch = [torch.tensor(data) for data in 
batch] + batch.append(txt_image_id) + batch.append(caption) + + yield batch + + + def __len__(self): + return len(self._entry) * 2 + + +class BertPreprocessRetrieval(object): + def __init__( + self, + caption_path, + tokenizer, + seq_len, + region_len, + data_size, + split="Train", + encoding="utf-8", + predict_feature=False, + ): + + self.split = split + self.seq_len = seq_len + self.region_len = region_len + self.tokenizer = tokenizer + self.predict_feature = predict_feature + self.num_caps = data_size + self.captions = list(json.load(open(caption_path, 'r')).values())[:data_size] + + def __call__(self, data): + + image_feature_wp, image_target_wp, image_location_wp, num_boxes, image_h, image_w, image_id, caption = data + + image_feature = np.zeros((self.region_len, 2048), dtype=np.float32) + image_target = np.zeros((self.region_len, 1601), dtype=np.float32) + image_location = np.zeros((self.region_len, 5), dtype=np.float32) + + num_boxes = int(num_boxes) + image_feature[:num_boxes] = image_feature_wp + image_target[:num_boxes] = image_target_wp + image_location[:num_boxes,:4] = image_location_wp + + image_location[:,4] = (image_location[:,3] - image_location[:,1]) * (image_location[:,2] - image_location[:,0]) / (float(image_w) * float(image_h)) + + image_location[:,0] = image_location[:,0] / float(image_w) + image_location[:,1] = image_location[:,1] / float(image_h) + image_location[:,2] = image_location[:,2] / float(image_w) + image_location[:,3] = image_location[:,3] / float(image_h) + + label = 0 + + tokens_caption = self.tokenizer.tokenize(caption) + cur_example = InputExample( + image_feat=image_feature, + image_target=image_target, + caption=tokens_caption, + is_next=label, + image_loc=image_location, + num_boxes=num_boxes + ) + + # transform sample to features + cur_features = self.convert_example_to_features(cur_example, self.seq_len, self.tokenizer, self.region_len) + + cur_tensors = ( + cur_features.input_ids, + cur_features.input_mask, + 
cur_features.segment_ids, + cur_features.is_next, + cur_features.image_feat, + cur_features.image_loc, + cur_features.image_mask, + float(image_id), + caption, + ) + return cur_tensors + + + def convert_example_to_features(self, example, max_seq_length, tokenizer, max_region_length): + """ + Convert a raw sample (pair of sentences as tokenized strings) into a proper training sample with + IDs, LM labels, input_mask, CLS and SEP tokens etc. + :param example: InputExample, containing sentence input as strings and is_next label + :param max_seq_length: int, maximum length of sequence. + :param tokenizer: Tokenizer + :return: InputFeatures, containing all inputs and labels of one sample as IDs (as used for model training) + """ + image_feat = example.image_feat + caption = example.caption + image_loc = example.image_loc + # image_target = example.image_target + num_boxes = int(example.num_boxes) + self._truncate_seq_pair(caption, max_seq_length - 2) + # caption, caption_label = self.random_word(caption, tokenizer) + caption_label = None + # image_feat, image_loc, image_label = self.random_region(image_feat, image_loc, num_boxes) + image_label = None + + tokens = [] + segment_ids = [] + + tokens.append("[CLS]") + segment_ids.append(0) + + for token in caption: + tokens.append(token) + segment_ids.append(0) + tokens.append("[SEP]") + segment_ids.append(0) + + input_ids = tokenizer.convert_tokens_to_ids(tokens) + + # The mask has 1 for real tokens and 0 for padding tokens. Only real + # tokens are attended to. + # input_ids = input_ids[:1] input_ids[1:] + input_mask = [1] * (len(input_ids)) + image_mask = [1] * (num_boxes) + # Zero-pad up to the visual sequence length. + while len(image_mask) < max_region_length: + image_mask.append(0) + + # Zero-pad up to the sequence length. 
+ while len(input_ids) < max_seq_length: + input_ids.append(0) + input_mask.append(0) + segment_ids.append(0) + + assert len(input_ids) == max_seq_length + assert len(input_mask) == max_seq_length + assert len(segment_ids) == max_seq_length + assert len(image_mask) == max_region_length + + features = InputFeatures( + input_ids=np.array(input_ids), + input_mask=np.array(input_mask), + segment_ids=np.array(segment_ids), + is_next=np.array(example.is_next), + image_feat=image_feat, + image_loc=image_loc, + image_mask = np.array(image_mask), + ) + return features + + def _truncate_seq_pair(self, tokens_b, max_length): + """Truncates a sequence pair in place to the maximum length.""" + + # This is a simple heuristic which will always truncate the longer sequence + # one token at a time. This makes more sense than truncating an equal percent + # of tokens from each, since if one sequence is very short then each token + # that's truncated likely contains more information than a longer sequence. 
+ while True: + total_length = len(tokens_b) + if total_length <= max_length: + break + + tokens_b.pop() diff --git a/vilbert/datasets/refer_expression_dataset.py b/vilbert/datasets/refer_expression_dataset.py new file mode 100644 index 0000000..04edcbf --- /dev/null +++ b/vilbert/datasets/refer_expression_dataset.py @@ -0,0 +1,222 @@ +import json +from typing import Any, Dict, List +import random +import os + +import torch +from torch.utils.data import Dataset +import numpy as np + +from pytorch_pretrained_bert.tokenization import BertTokenizer +from ._image_features_reader import ImageFeaturesH5Reader +import _pickle as cPickle + +import sys +#sys.path.append("tools/refer") +from tools.refer.refer import REFER + +def iou(anchors, gt_boxes): + """ + anchors: (N, 4) ndarray of float + gt_boxes: (K, 4) ndarray of float + overlaps: (N, K) ndarray of overlap between boxes and query_boxes + """ + N = anchors.size(0) + K = gt_boxes.size(0) + + gt_boxes_area = ((gt_boxes[:,2] - gt_boxes[:,0] + 1) * + (gt_boxes[:,3] - gt_boxes[:,1] + 1)).view(1, K) + + anchors_area = ((anchors[:,2] - anchors[:,0] + 1) * + (anchors[:,3] - anchors[:,1] + 1)).view(N, 1) + + boxes = anchors.view(N, 1, 4).expand(N, K, 4) + query_boxes = gt_boxes.view(1, K, 4).expand(N, K, 4) + + iw = (torch.min(boxes[:,:,2], query_boxes[:,:,2]) - + torch.max(boxes[:,:,0], query_boxes[:,:,0]) + 1) + iw[iw < 0] = 0 + + ih = (torch.min(boxes[:,:,3], query_boxes[:,:,3]) - + torch.max(boxes[:,:,1], query_boxes[:,:,1]) + 1) + ih[ih < 0] = 0 + + ua = anchors_area + gt_boxes_area - (iw * ih) + overlaps = iw * ih / ua + + return overlaps + +def assert_eq(real, expected): + assert real == expected, "%s (true) vs %s (expected)" % (real, expected) + +class ReferExpressionDataset(Dataset): + def __init__( + self, + task: str, + dataroot: str, + annotations_jsonpath: str, + split: str, + image_features_reader: ImageFeaturesH5Reader, + gt_image_features_reader: ImageFeaturesH5Reader, + tokenizer: BertTokenizer, + 
padding_index: int = 0, + max_seq_length: int = 20, + max_region_num: int = 60 + ): + self.split = split + self.refer = REFER(dataroot, dataset=task, splitBy='unc') + self.ref_ids = self.refer.getRefIds(split=split) + print('%s refs are in split [%s].' % (len(self.ref_ids), split)) + + self.num_labels = 1 + self._image_features_reader = image_features_reader + self._gt_image_features_reader = gt_image_features_reader + self._tokenizer = tokenizer + + self._padding_index = padding_index + self._max_seq_length = max_seq_length + self.entries = self._load_annotations() + + self.max_region_num = max_region_num + + cache_path = os.path.join(dataroot, "cache", task + '_' + split + '_' + str(max_seq_length)+ "_" + str(max_region_num) + '.pkl') + if not os.path.exists(cache_path): + self.tokenize() + self.tensorize() + cPickle.dump(self.entries, open(cache_path, 'wb')) + else: + print('loading entries from %s' %(cache_path)) + self.entries = cPickle.load(open(cache_path, "rb")) + + def _load_annotations(self): + + # annotations_json: Dict[str, Any] = json.load(open(annotations_jsonpath)) + + # Build an index which maps image id with a list of caption annotations. + entries = [] + + for ref_id in self.ref_ids: + ref = self.refer.Refs[ref_id] + image_id = ref['image_id'] + ref_id = ref['ref_id'] + refBox = self.refer.getRefBox(ref_id) + for sent, sent_id in zip(ref['sentences'], ref['sent_ids']): + caption = sent['raw'] + entries.append( + {"caption": caption, 'sent_id':sent_id, 'image_id':image_id, \ + "refBox": refBox, 'ref_id': ref_id} + ) + + return entries + + def tokenize(self): + """Tokenizes the captions. + + This will add caption_tokens in each entry of the dataset. + -1 represents nil, and should be treated as padding_idx in embedding. 
+ """ + for entry in self.entries: + + sentence_tokens = self._tokenizer.tokenize(entry["caption"]) + sentence_tokens = ["[CLS]"] + sentence_tokens + ["[SEP]"] + + tokens = [ + self._tokenizer.vocab.get(w, self._tokenizer.vocab["[UNK]"]) + for w in sentence_tokens + ] + + tokens = tokens[:self._max_seq_length] + segment_ids = [0] * len(tokens) + input_mask = [1] * len(tokens) + + if len(tokens) < self._max_seq_length: + # Note here we pad in front of the sentence + padding = [self._padding_index] * (self._max_seq_length - len(tokens)) + tokens = tokens + padding + input_mask += padding + segment_ids += padding + + assert_eq(len(tokens), self._max_seq_length) + entry["token"] = tokens + entry["input_mask"] = input_mask + entry["segment_ids"] = segment_ids + + def tensorize(self): + + for entry in self.entries: + token = torch.from_numpy(np.array(entry["token"])) + entry["token"] = token + + input_mask = torch.from_numpy(np.array(entry["input_mask"])) + entry["input_mask"] = input_mask + + segment_ids = torch.from_numpy(np.array(entry["segment_ids"])) + entry["segment_ids"] = segment_ids + + + def __getitem__(self, index): + entry = self.entries[index] + + image_id = entry["image_id"] + ref_box = entry["refBox"] + + ref_box = [ref_box[0], ref_box[1], ref_box[0]+ref_box[2], ref_box[1]+ref_box[3]] + features, num_boxes, boxes, boxes_ori = self._image_features_reader[image_id] + + boxes_ori = boxes_ori[:num_boxes] + boxes = boxes[:num_boxes] + features = features[:num_boxes] + + if self.split == 'train': + gt_features, gt_num_boxes, gt_boxes, gt_boxes_ori = self._gt_image_features_reader[image_id] + + # merge two boxes, and assign the labels. 
+ gt_boxes_ori = gt_boxes_ori[1:gt_num_boxes] + gt_boxes = gt_boxes[1:gt_num_boxes] + gt_features = gt_features[1:gt_num_boxes] + + # concatenate the boxes + mix_boxes_ori = np.concatenate((boxes_ori, gt_boxes_ori), axis=0) + mix_boxes = np.concatenate((boxes, gt_boxes), axis=0) + mix_features = np.concatenate((features, gt_features), axis=0) + mix_num_boxes = min(int(num_boxes + int(gt_num_boxes) - 1), self.max_region_num) + # given the mix boxes, and ref_box, calculate the overlap. + mix_target = iou(torch.tensor(mix_boxes_ori[:,:4]).float(), torch.tensor([ref_box]).float()) + mix_target[mix_target<0.5] = 0 + + else: + mix_boxes_ori = boxes_ori + mix_boxes = boxes + mix_features = features + mix_num_boxes = min(int(num_boxes), self.max_region_num) + mix_target = iou(torch.tensor(mix_boxes_ori[:,:4]).float(), torch.tensor([ref_box]).float()) + + image_mask = [1] * (mix_num_boxes) + while len(image_mask) < self.max_region_num: + image_mask.append(0) + + mix_boxes_pad = np.zeros((self.max_region_num, 5)) + mix_features_pad = np.zeros((self.max_region_num, 2048)) + + mix_boxes_pad[:mix_num_boxes] = mix_boxes[:mix_num_boxes] + mix_features_pad[:mix_num_boxes] = mix_features[:mix_num_boxes] + + # appending the target feature. 
+ features = torch.tensor(mix_features_pad).float() + image_mask = torch.tensor(image_mask).long() + spatials = torch.tensor(mix_boxes_pad).float() + + target = torch.zeros((self.max_region_num,1)).float() + target[:mix_num_boxes] = mix_target + + spatials_ori = torch.tensor(mix_boxes_ori).float() + co_attention_mask = torch.zeros((self.max_region_num, self._max_seq_length)) + + caption = entry["token"] + input_mask = entry["input_mask"] + segment_ids = entry["segment_ids"] + + return features, spatials, image_mask, caption, target, input_mask, segment_ids, co_attention_mask, image_id + + def __len__(self): + return len(self.entries) diff --git a/vilbert/datasets/retreival_dataset.py b/vilbert/datasets/retreival_dataset.py new file mode 100644 index 0000000..332c000 --- /dev/null +++ b/vilbert/datasets/retreival_dataset.py @@ -0,0 +1,393 @@ +import json +from typing import Any, Dict, List +import random +import os + +import torch +from torch.utils.data import Dataset +import numpy as np +import _pickle as cPickle + +from pytorch_pretrained_bert.tokenization import BertTokenizer +from ._image_features_reader import ImageFeaturesH5Reader +import jsonlines +import sys +import pdb + +def assert_eq(real, expected): + assert real == expected, "%s (true) vs %s (expected)" % (real, expected) + +def _load_annotations(annotations_jsonpath, task): + + with jsonlines.open(annotations_jsonpath) as reader: + + # Build an index which maps image id with a list of caption annotations. 
+ entries = [] + imgid2entry = {} + count = 0 + + for annotation in reader: + if task == 'RetrievalCOCO': + image_id = annotation['id'] + elif task == 'RetrievalFlickr30k': + image_id = int(annotation['img_path'].split('.')[0]) + imgid2entry[image_id] = [] + for sentences in annotation['sentences']: + entries.append({"caption": sentences, 'image_id':image_id}) + imgid2entry[image_id].append(count) + count += 1 + + return entries, imgid2entry + + +class RetreivalDataset(Dataset): + def __init__( + self, + task: str, + dataroot: str, + annotations_jsonpath: str, + split: str, + image_features_reader: ImageFeaturesH5Reader, + gt_image_features_reader: ImageFeaturesH5Reader, + tokenizer: BertTokenizer, + padding_index: int = 0, + max_seq_length: int = 20, + max_region_num: int = 37, + ): + # All the keys in `self._entries` would be present in `self._image_features_reader` + + self._entries, self.imgid2entry = _load_annotations(annotations_jsonpath, task) + self.image_id_list = [*self.imgid2entry] + + self._image_features_reader = image_features_reader + self._tokenizer = tokenizer + self.num_labels = 1 + self._split = split + self._padding_index = padding_index + self._max_region_num = max_region_num + self._max_seq_length = max_seq_length + + if self._split == 'train': + image_info = cPickle.load(open(os.path.join(dataroot, 'hard_negative.pkl'), 'rb')) + for key, value in image_info.items(): + setattr(self, key, value) + self.train_imgId2pool = {imageId:i for i, imageId in enumerate(self.train_image_list)} + + cache_path = os.path.join(dataroot, "cache", task + '_' + split + '_' + str(max_seq_length)+'.pkl') + + if not os.path.exists(cache_path): + self.tokenize() + self.tensorize() + cPickle.dump(self._entries, open(cache_path, 'wb')) + else: + print('loading entries from %s' %(cache_path)) + self._entries = cPickle.load(open(cache_path, "rb")) + + def tokenize(self): + """Tokenizes the captions. + + This will add caption_tokens in each entry of the dataset. 
+        -1 represents nil, and should be treated as padding_idx in embedding.
+        """
+        for entry in self._entries:
+            sentence_tokens = self._tokenizer.tokenize(entry["caption"])
+            sentence_tokens = ["[CLS]"] + sentence_tokens + ["[SEP]"]
+
+            tokens = [
+                self._tokenizer.vocab.get(w, self._tokenizer.vocab["[UNK]"])
+                for w in sentence_tokens
+            ]
+            tokens = tokens[:self._max_seq_length]
+            segment_ids = [0] * len(tokens)
+            input_mask = [1] * len(tokens)
+
+            if len(tokens) < self._max_seq_length:
+                # Note here we pad at the end of the sentence
+                padding = [self._padding_index] * (self._max_seq_length - len(tokens))
+                tokens = tokens + padding
+                input_mask += padding
+                segment_ids += padding
+
+            assert_eq(len(tokens), self._max_seq_length)
+            entry["token"] = tokens
+            entry["input_mask"] = input_mask
+            entry["segment_ids"] = segment_ids
+
+    def tensorize(self):
+
+        for entry in self._entries:
+            token = torch.from_numpy(np.array(entry["token"]))
+            entry["token"] = token
+
+            input_mask = torch.from_numpy(np.array(entry["input_mask"]))
+            entry["input_mask"] = input_mask
+
+            segment_ids = torch.from_numpy(np.array(entry["segment_ids"]))
+            entry["segment_ids"] = segment_ids
+
+
+    def __getitem__(self, index):
+        entry = self._entries[index]
+        image_id = entry["image_id"]
+
+        features, num_boxes, boxes, _ = self._image_features_reader[image_id]
+
+        mix_num_boxes = min(int(num_boxes), self._max_region_num)
+        mix_boxes_pad = np.zeros((self._max_region_num, 5))
+        mix_features_pad = np.zeros((self._max_region_num, 2048))
+
+        image_mask = [1] * (int(mix_num_boxes))
+        while len(image_mask) < self._max_region_num:
+            image_mask.append(0)
+
+        mix_boxes_pad[:mix_num_boxes] = boxes[:mix_num_boxes]
+        mix_features_pad[:mix_num_boxes] = features[:mix_num_boxes]
+
+        features1 = torch.tensor(mix_features_pad).float()
+        image_mask1 = torch.tensor(image_mask).long()
+        spatials1 = torch.tensor(mix_boxes_pad).float()
+
+        caption1 = entry["token"]
+        input_mask1 = entry["input_mask"]
+        segment_ids1 = 
entry["segment_ids"]
+        # negative samples.
+        # 1: correct one, 2: random caption wrong, 3: random image wrong. 4: hard image wrong.
+
+        while True:
+            # sample a random image:
+            img_id2 = random.choice(self.image_id_list)
+            if img_id2 != image_id: break
+
+        entry2 = self._entries[random.choice(self.imgid2entry[img_id2])]
+
+        features2 = features1
+        image_mask2 = image_mask1
+        spatials2 = spatials1
+        caption2 = entry2["token"]
+        input_mask2 = entry2["input_mask"]
+        segment_ids2 = entry2["segment_ids"]
+
+        # random image wrong
+        while True:
+            # sample a random image:
+            img_id3 = random.choice(self.image_id_list)
+            if img_id3 != image_id: break
+
+        features3, num_boxes3, boxes3, _ = self._image_features_reader[img_id3]
+
+        mix_num_boxes3 = min(int(num_boxes3), self._max_region_num)
+        mix_boxes_pad3 = np.zeros((self._max_region_num, 5))
+        mix_features_pad3 = np.zeros((self._max_region_num, 2048))
+
+        image_mask3 = [1] * (int(mix_num_boxes3))
+        while len(image_mask3) < self._max_region_num:
+            image_mask3.append(0)
+
+        # pad into the negative image's own buffers, not the ones already holding sample 1
+        mix_boxes_pad3[:mix_num_boxes3] = boxes3[:mix_num_boxes3]
+        mix_features_pad3[:mix_num_boxes3] = features3[:mix_num_boxes3]
+
+        features3 = torch.tensor(mix_features_pad3).float()
+        image_mask3 = torch.tensor(image_mask3).long()
+        spatials3 = torch.tensor(mix_boxes_pad3).float()
+
+        caption3 = caption1
+        input_mask3 = input_mask1
+        segment_ids3 = segment_ids1
+
+        if self._split == 'train':
+            # random hard caption.
+ rand_img_id_pool = self.train_hard_pool[self.train_imgId2pool[image_id]] + pool_img_idx = int(rand_img_id_pool[np.random.randint(1, len(rand_img_id_pool))]) + img_id4 = self.train_image_list[pool_img_idx] + else: + while True: + # sample a random image: + img_id4 = random.choice(self.image_id_list) + if img_id4 != image_id: break + + entry4 = self._entries[random.choice(self.imgid2entry[img_id4])] + + features4 = features1 + image_mask4 = image_mask1 + spatials4 = spatials1 + caption4 = entry4["token"] + input_mask4 = entry4["input_mask"] + segment_ids4 = entry4["segment_ids"] + + features = torch.stack([features1, features2, features3, features4], dim=0) + spatials = torch.stack([spatials1, spatials2, spatials3, spatials4], dim=0) + image_mask = torch.stack([image_mask1, image_mask2, image_mask3, image_mask4], dim=0) + caption = torch.stack([caption1, caption2, caption3, caption4], dim=0) + input_mask = torch.stack([input_mask1, input_mask2, input_mask3, input_mask4], dim=0) + segment_ids = torch.stack([segment_ids1, segment_ids2, segment_ids3, segment_ids4], dim=0) + co_attention_mask = torch.zeros((4, self._max_region_num, self._max_seq_length)) + target = 0 + + return features, spatials, image_mask, caption, target, input_mask, segment_ids, co_attention_mask, image_id + + def __len__(self): + return len(self._entries) + +def _load_annotationsVal(annotations_jsonpath, task): + + with jsonlines.open(annotations_jsonpath) as reader: + + # Build an index which maps image id with a list of caption annotations. 
+ image_entries = {} + caption_entries = [] + + for annotation in reader: + if task == 'RetrievalCOCO': + image_id = annotation['id'] + elif task == 'RetrievalFlickr30k': + image_id = int(annotation['img_path'].split('.')[0]) + + image_entries[image_id] = 1 + + for sentences in annotation['sentences']: + caption_entries.append({"caption": sentences, 'image_id':image_id}) + + image_entries = [*image_entries] + + return image_entries, caption_entries + +class RetreivalDatasetVal(Dataset): + def __init__( + self, + task: str, + dataroot: str, + annotations_jsonpath: str, + split: str, + image_features_reader: ImageFeaturesH5Reader, + gt_image_features_reader: ImageFeaturesH5Reader, + tokenizer: BertTokenizer, + padding_index: int = 0, + max_seq_length: int = 20, + max_region_num: int = 101, + ): + # All the keys in `self._entries` would be present in `self._image_features_reader` + + self._image_entries, self._caption_entries = _load_annotationsVal(annotations_jsonpath, task) + self._image_features_reader = image_features_reader + self._tokenizer = tokenizer + + self._split = split + self._padding_index = padding_index + self._max_region_num = max_region_num + self._max_seq_length = max_seq_length + self.num_labels = 1 + + # cache file path data/cache/train_ques + # cap_cache_path = "data/cocoRetreival/cache/val_cap.pkl" + # if not os.path.exists(cap_cache_path): + self.tokenize() + self.tensorize() + # cPickle.dump(self._entries, open(cap_cache_path, 'wb')) + # else: + # print('loading entries from %s' %(cap_cache_path)) + # self._entries = cPickle.load(open(cap_cache_path, "rb")) +# + self.features_all = np.zeros((1000, self._max_region_num, 2048)) + self.spatials_all = np.zeros((1000, self._max_region_num, 5)) + self.image_mask_all = np.zeros((1000, self._max_region_num)) + + for i, image_id in enumerate(self._image_entries): + features, num_boxes, boxes, _ = self._image_features_reader[image_id] + + mix_num_boxes = min(int(num_boxes), self._max_region_num) + 
mix_boxes_pad = np.zeros((self._max_region_num, 5)) + mix_features_pad = np.zeros((self._max_region_num, 2048)) + + image_mask = [1] * (int(mix_num_boxes)) + while len(image_mask) < self._max_region_num: + image_mask.append(0) + + mix_boxes_pad[:mix_num_boxes] = boxes[:mix_num_boxes] + mix_features_pad[:mix_num_boxes] = features[:mix_num_boxes] + + self.features_all[i] = mix_features_pad + self.image_mask_all[i] = np.array(image_mask) + self.spatials_all[i] = mix_boxes_pad + + sys.stdout.write('%d/%d\r' % (i, len(self._image_entries))) + sys.stdout.flush() + + self.features_all = torch.Tensor(self.features_all).float() + self.image_mask_all = torch.Tensor(self.image_mask_all).long() + self.spatials_all = torch.Tensor(self.spatials_all).float() + + def tokenize(self): + """Tokenizes the captions. + + This will add caption_tokens in each entry of the dataset. + -1 represents nil, and should be treated as padding_idx in embedding. + """ + for entry in self._caption_entries: + sentence_tokens = self._tokenizer.tokenize(entry["caption"]) + sentence_tokens = ["[CLS]"] + sentence_tokens + ["[SEP]"] + + tokens = [ + self._tokenizer.vocab.get(w, self._tokenizer.vocab["[UNK]"]) + for w in sentence_tokens + ] + tokens = tokens[:self._max_seq_length] + segment_ids = [0] * len(tokens) + input_mask = [1] * len(tokens) + + if len(tokens) < self._max_seq_length: + # Note here we pad in front of the sentence + padding = [self._padding_index] * (self._max_seq_length - len(tokens)) + tokens = tokens + padding + input_mask += padding + segment_ids += padding + + assert_eq(len(tokens), self._max_seq_length) + entry["token"] = tokens + entry["input_mask"] = input_mask + entry["segment_ids"] = segment_ids + + def tensorize(self): + for entry in self._caption_entries: + token = torch.from_numpy(np.array(entry["token"])).long() + entry["token"] = token + + input_mask = torch.from_numpy(np.array(entry["input_mask"])) + entry["input_mask"] = input_mask + + segment_ids = 
torch.from_numpy(np.array(entry["segment_ids"])).long() + entry["segment_ids"] = segment_ids + + def __getitem__(self, index): + + # we iterate through every caption here. + caption_idx = int(index / 2) + image_idx = index % 2 + + if image_idx == 0: + image_entries = self._image_entries[:500] + features_all = self.features_all[:500] + spatials_all = self.spatials_all[:500] + image_mask_all = self.image_mask_all[:500] + + else: + image_entries = self._image_entries[500:] + features_all = self.features_all[500:] + spatials_all = self.spatials_all[500:] + image_mask_all = self.image_mask_all[500:] + + entry = self._caption_entries[caption_idx] + caption = entry["token"] + input_mask = entry["input_mask"] + segment_ids = entry["segment_ids"] + + target_all = torch.zeros(500) + for i, image_id in enumerate(image_entries): + if image_id == entry["image_id"]: + target_all[i] = 1 + + return features_all, spatials_all, image_mask_all, caption, input_mask, segment_ids, target_all, caption_idx, image_idx + + def __len__(self): + return len(self._caption_entries) * 2 diff --git a/vilbert/datasets/vcr_dataset.py b/vilbert/datasets/vcr_dataset.py new file mode 100644 index 0000000..0c7dfb4 --- /dev/null +++ b/vilbert/datasets/vcr_dataset.py @@ -0,0 +1,337 @@ +import json +from typing import Any, Dict, List +import random +import os + +import torch +from torch.utils.data import Dataset +import numpy as np +import _pickle as cPickle +import json_lines + +from pytorch_pretrained_bert.tokenization import BertTokenizer +from ._image_features_reader import ImageFeaturesH5Reader +import pdb +import csv +import sys + +def assert_eq(real, expected): + assert real == expected, "%s (true) vs %s (expected)" % (real, expected) + +def _converId(img_id): + + img_id = img_id.split('-') + if 'train' in img_id[0]: + new_id = int(img_id[1]) + elif 'val' in img_id[0]: + new_id = int(img_id[1]) + 1000000 + elif 'test' in img_id[0]: + new_id = int(img_id[1]) + 2000000 + else: + pdb.set_trace() + + 
return new_id
+
+
+def _load_annotationsQ_A(annotations_jsonpath, split):
+    """Build an index of VCR Q->A annotations, one entry per question with its answer choices and image ID."""
+    entries = []
+    with open(annotations_jsonpath, 'rb') as f:  # opening file in binary (rb) mode
+        for annotation in json_lines.reader(f):
+            # metadata_fn = json.load(open(os.path.join('data/VCR/vcr1images', annotation["metadata_fn"]), 'r'))
+            # det_names = metadata_fn["names"]
+            det_names = ""
+            question = annotation["question"]
+            if split == 'test':
+                ans_label = 0
+            else:
+                ans_label = annotation["answer_label"]
+            img_id = _converId(annotation["img_id"])
+            anno_id = int(annotation["annot_id"].split('-')[1])
+            entries.append(
+                {"question": question, 'answers': annotation["answer_choices"], "metadata_fn": annotation["metadata_fn"], 'target': ans_label, 'img_id': img_id, 'anno_id': anno_id}
+            )
+
+    return entries
+
+def _load_annotationsQA_R(annotations_jsonpath, split):
+    """Build an index of VCR QA->R annotations, one entry per question (plus chosen answer) with its rationale choices and image ID."""
+    entries = []
+    with open(annotations_jsonpath, 'rb') as f:  # opening file in binary (rb) mode
+        for annotation in json_lines.reader(f):
+            # metadata_fn = json.load(open(os.path.join('data/VCR/vcr1images', annotation["metadata_fn"]), 'r'))
+            # det_names = metadata_fn["names"]
+            if split == 'test':
+                # for each answer
+                for answer in annotation["answer_choices"]:
+                    question = annotation["question"] + ["[SEP]"] + answer
+                    img_id = _converId(annotation["img_id"])
+                    ans_label = 0
+                    anno_id = int(annotation["annot_id"].split('-')[1])
+                    entries.append(
+                        {"question": question, 'answers': annotation["rationale_choices"], "metadata_fn": annotation["metadata_fn"], 'target': ans_label, 'img_id': img_id}
+                    )
+            else:
+                det_names = ""
+                question = annotation["question"] + ["[SEP]"] + annotation["answer_choices"][annotation['answer_label']]
+                ans_label = annotation["rationale_label"]
+                # img_fn = annotation["img_fn"]
+                img_id = 
_converId(annotation["img_id"]) + anno_id = int(annotation["annot_id"].split('-')[1]) + entries.append( + {"question": question, 'answers':annotation["rationale_choices"], "metadata_fn": annotation["metadata_fn"], 'target':ans_label, 'img_id':img_id, 'anno_id':anno_id} + ) + + return entries + +class VCRDataset(Dataset): + def __init__( + self, + task: str, + dataroot: str, + annotations_jsonpath: str, + split: str, + image_features_reader: ImageFeaturesH5Reader, + gt_image_features_reader: ImageFeaturesH5Reader, + tokenizer: BertTokenizer, + padding_index: int = 0, + max_seq_length: int = 40, + max_region_num: int = 60 + ): + # All the keys in `self._entries` would be present in `self._image_features_reader` + if task == 'VCR_Q-A': + self._entries = _load_annotationsQ_A(annotations_jsonpath, split) + elif task == "VCR_QA-R": + self._entries = _load_annotationsQA_R(annotations_jsonpath, split) + else: + assert False + self._split = split + self._image_features_reader = image_features_reader + self._gt_image_features_reader = gt_image_features_reader + self._tokenizer = tokenizer + + self._padding_index = padding_index + self._max_caption_length = max_seq_length + self._max_region_num = max_region_num + self.num_labels = 1 + + self._names = [] + with open('data/VCR/unisex_names_table.csv') as csv_file: + csv_reader = csv.reader(csv_file, delimiter=',') + for row in csv_reader: + if row[1] != 'name': + self._names.append(row[1]) + + # cache file path data/cache/train_ques + cache_path = "data/VCR/cache/" + split + '_' + task + "_" + str(max_seq_length) + "_" + str(max_region_num) + "_vcr.pkl" + if not os.path.exists(cache_path): + self.tokenize() + self.tensorize() + cPickle.dump(self._entries, open(cache_path, 'wb')) + else: + self._entries = cPickle.load(open(cache_path, "rb")) + + def tokenize(self): + """Tokenizes the captions. + + This will add caption_tokens in each entry of the dataset. + -1 represents nil, and should be treated as padding_idx in embedding. 
+ """ + count = 0 + for entry in self._entries: + metadata_fn = json.load(open(os.path.join('data/VCR/vcr1images', entry["metadata_fn"]), 'r')) + det_names = metadata_fn["names"] + random_names = self.generate_random_name(det_names) + # replace with name + tokens_a, mask_a = self.replace_det_with_name(entry["question"], random_names) + + input_ids_all = [] + co_attention_mask_all = [] + input_mask_all = [] + segment_ids_all = [] + + for answer in entry["answers"]: + tokens_b, mask_b = self.replace_det_with_name(answer, random_names) + + self._truncate_seq_pair(tokens_a, tokens_b, mask_a, mask_b, self._max_caption_length - 3) + + tokens = [] + segment_ids = [] + tokens.append("[CLS]") + segment_ids.append(0) + + for token in tokens_a: + tokens.append(token) + segment_ids.append(0) + + tokens.append("[SEP]") + segment_ids.append(0) + + assert len(tokens_b) > 0 + for token in tokens_b: + tokens.append(token) + segment_ids.append(1) + tokens.append("[SEP]") + segment_ids.append(1) + + input_ids = self._tokenizer.convert_tokens_to_ids(tokens) + co_attention_mask = [-1] + mask_a + [-1] + mask_b + [-1] + + input_mask = [1] * len(input_ids) + # Zero-pad up to the sequence length. 
+ while len(input_ids) < self._max_caption_length: + input_ids.append(0) + input_mask.append(0) + segment_ids.append(0) + co_attention_mask.append(-1) + + assert len(input_ids) == self._max_caption_length + assert len(input_mask) == self._max_caption_length + assert len(segment_ids) == self._max_caption_length + + co_attention_mask_all.append(co_attention_mask) + input_ids_all.append(input_ids) + input_mask_all.append(input_mask) + segment_ids_all.append(segment_ids) + + entry["co_attention_mask"] = co_attention_mask_all + entry["input_ids"] = input_ids_all + entry["input_mask"] = input_mask_all + entry["segment_ids"] = segment_ids_all + + sys.stdout.write('%d/%d\r' % (count, len(self._entries))) + sys.stdout.flush() + count += 1 + + def tensorize(self): + + for entry in self._entries: + input_ids = torch.from_numpy(np.array(entry["input_ids"])) + entry["input_ids"] = input_ids + + input_mask = torch.from_numpy(np.array(entry["input_mask"])) + entry["input_mask"] = input_mask + + segment_ids = torch.from_numpy(np.array(entry["segment_ids"])) + entry["segment_ids"] = segment_ids + + def generate_random_name(self, det_names): + random_name = [] + for name in det_names: + if name == 'person': + word = random.choice(self._names) + else: + word = name + random_name.append(word) + + return random_name + + def replace_det_with_name(self, inputs, random_names): + tokens = [] + mask = [] + for w in inputs: + if isinstance(w, str): + word = w + det = -1 + word_token = self._tokenizer.tokenize(word) + mask += [det] * len(word_token) + tokens += word_token + else: + for idx in w: + word = random_names[idx] + word_token = self._tokenizer.tokenize(word) + mask += [idx] * len(word_token) + tokens += word_token + + return tokens, mask + + def _truncate_seq_pair(self, tokens_a, tokens_b, mask_a, mask_b, max_length): + """Truncates a sequence pair in place to the maximum length.""" + + # This is a simple heuristic which will always truncate the longer sequence + # one token at a 
time. This makes more sense than truncating an equal percent + # of tokens from each, since if one sequence is very short then each token + # that's truncated likely contains more information than a longer sequence. + while True: + total_length = len(tokens_a) + len(tokens_b) + if total_length <= max_length: + break + if len(tokens_a) > len(tokens_b): + tokens_a.pop() + mask_a.pop() + else: + tokens_b.pop() + mask_b.pop() + + def __getitem__(self, index): + + entry = self._entries[index] + + image_id = entry["img_id"] + features, num_boxes, boxes, _ = self._image_features_reader[image_id] + + boxes = boxes[:num_boxes] + features = features[:num_boxes] + + gt_features, gt_num_boxes, gt_boxes, _ = self._gt_image_features_reader[image_id] + + # merge two features. + features[0] = (features[0] * num_boxes + gt_features[0] * gt_num_boxes) / (num_boxes + gt_num_boxes) + + # merge two boxes, and assign the labels. + gt_boxes = gt_boxes[1:gt_num_boxes] + gt_features = gt_features[1:gt_num_boxes] + gt_num_boxes = gt_num_boxes - 1 + + gt_box_preserve = min(self._max_region_num-1, gt_num_boxes) + gt_boxes = gt_boxes[:gt_box_preserve] + gt_features = gt_features[:gt_box_preserve] + gt_num_boxes = gt_box_preserve + + num_box_preserve = min(self._max_region_num - int(gt_num_boxes), int(num_boxes)) + boxes = boxes[:num_box_preserve] + features = features[:num_box_preserve] + + # concatenate the boxes + mix_boxes = np.concatenate((boxes, gt_boxes), axis=0) + mix_features = np.concatenate((features, gt_features), axis=0) + mix_num_boxes = num_box_preserve + int(gt_num_boxes) + + image_mask = [1] * (mix_num_boxes) + while len(image_mask) < self._max_region_num: + image_mask.append(0) + + mix_boxes_pad = np.zeros((self._max_region_num, 5)) + mix_features_pad = np.zeros((self._max_region_num, 2048)) + + mix_boxes_pad[:mix_num_boxes] = mix_boxes[:mix_num_boxes] + mix_features_pad[:mix_num_boxes] = mix_features[:mix_num_boxes] + + # appending the target feature. 
+ features = torch.tensor(mix_features_pad).float() + image_mask = torch.tensor(image_mask).long() + spatials = torch.tensor(mix_boxes_pad).float() + + input_ids = entry["input_ids"] + input_mask = entry["input_mask"] + segment_ids = entry["segment_ids"] + target = int(entry["target"]) + + if self._split == 'test': + # anno_id = entry["anno_id"] + anno_id = 0#entry["anno_id"] + else: + anno_id = entry["img_id"] + + co_attention_idxs = entry["co_attention_mask"] + co_attention_mask = torch.zeros((len(entry["co_attention_mask"]), self._max_region_num, self._max_caption_length)) + + for ii, co_attention_idx in enumerate(co_attention_idxs): + for jj, idx in enumerate(co_attention_idx): + if idx != -1 and idx+num_box_preserve < self._max_region_num: + co_attention_mask[ii, idx+num_box_preserve, jj] = 1 + + return features, spatials, image_mask, input_ids, target, input_mask, segment_ids, co_attention_mask, anno_id + + def __len__(self): + return len(self._entries) diff --git a/vilbert/datasets/vqa_dataset.py b/vilbert/datasets/vqa_dataset.py new file mode 100644 index 0000000..fe5c19a --- /dev/null +++ b/vilbert/datasets/vqa_dataset.py @@ -0,0 +1,223 @@ +import os +import json +import _pickle as cPickle +import logging + +import numpy as np +import torch +from torch.utils.data import Dataset +from pytorch_pretrained_bert.tokenization import BertTokenizer + +from ._image_features_reader import ImageFeaturesH5Reader +import pdb +logger = logging.getLogger(__name__) # pylint: disable=invalid-name +os.environ["HDF5_USE_FILE_LOCKING"] = "FALSE" + +def assert_eq(real, expected): + assert real == expected, "%s (true) vs %s (expected)" % (real, expected) + +def _create_entry(question, answer): + answer.pop("image_id") + answer.pop("question_id") + entry = { + "question_id": question["question_id"], + "image_id": question["image_id"], + "question": question["question"], + "answer": answer, + } + return entry + +def _load_dataset(dataroot, name): + """Load entries + + dataroot: 
root path of dataset
+    name: 'train', 'val', 'trainval', 'minval', 'test'
+    """
+    if name == 'train' or name == 'val':
+        question_path = os.path.join(dataroot, "v2_OpenEnded_mscoco_%s2014_questions.json" % name)
+        questions = sorted(json.load(open(question_path))["questions"], key=lambda x: x["question_id"])
+        answer_path = os.path.join(dataroot, "cache", "%s_target.pkl" % name)
+        answers = cPickle.load(open(answer_path, "rb"))
+        answers = sorted(answers, key=lambda x: x["question_id"])
+
+    elif name == 'trainval':
+        question_path_train = os.path.join(dataroot, "v2_OpenEnded_mscoco_%s2014_questions.json" % 'train')
+        questions_train = sorted(json.load(open(question_path_train))["questions"], key=lambda x: x["question_id"])
+        answer_path_train = os.path.join(dataroot, "cache", "%s_target.pkl" % 'train')
+        answers_train = cPickle.load(open(answer_path_train, "rb"))
+        answers_train = sorted(answers_train, key=lambda x: x["question_id"])
+
+        question_path_val = os.path.join(dataroot, "v2_OpenEnded_mscoco_%s2014_questions.json" % 'val')
+        questions_val = sorted(json.load(open(question_path_val))["questions"], key=lambda x: x["question_id"])
+        answer_path_val = os.path.join(dataroot, "cache", "%s_target.pkl" % 'val')
+        answers_val = cPickle.load(open(answer_path_val, "rb"))
+        answers_val = sorted(answers_val, key=lambda x: x["question_id"])
+        questions = questions_train + questions_val[:-3000]
+        answers = answers_train + answers_val[:-3000]
+
+    elif name == 'minval':
+        question_path_val = os.path.join(dataroot, "v2_OpenEnded_mscoco_%s2014_questions.json" % 'val')
+        questions_val = sorted(json.load(open(question_path_val))["questions"], key=lambda x: x["question_id"])
+        answer_path_val = os.path.join(dataroot, "cache", "%s_target.pkl" % 'val')
+        answers_val = cPickle.load(open(answer_path_val, "rb"))
+        answers_val = sorted(answers_val, key=lambda x: x["question_id"])
+        questions = questions_val[-3000:]
+        answers = answers_val[-3000:]
+
+    elif name == 'test':
+        question_path_test 
= os.path.join(dataroot, "v2_OpenEnded_mscoco_%s2015_questions.json" % 'test') + questions_test = sorted(json.load(open(question_path_test))["questions"], key=lambda x: x["question_id"]) + questions = questions_test + else: + assert False, "data split is not recognized." + + if 'test' in name: + entries = [] + for question in questions: + entries.append(question) + else: + assert_eq(len(questions), len(answers)) + entries = [] + for question, answer in zip(questions, answers): + assert_eq(question["question_id"], answer["question_id"]) + assert_eq(question["image_id"], answer["image_id"]) + entries.append(_create_entry(question, answer)) + return entries + +class VQAClassificationDataset(Dataset): + def __init__( + self, + task: str, + dataroot: str, + annotations_jsonpath: str, + split: str, + image_features_reader: ImageFeaturesH5Reader, + gt_image_features_reader: ImageFeaturesH5Reader, + tokenizer: BertTokenizer, + padding_index: int = 0, + max_seq_length: int = 16, + max_region_num: int = 37, + ): + super().__init__() + self.split = split + ans2label_path = os.path.join('data', task, "cache", "trainval_ans2label.pkl") + label2ans_path = os.path.join('data', task, "cache", "trainval_label2ans.pkl") + self.ans2label = cPickle.load(open(ans2label_path, "rb")) + self.label2ans = cPickle.load(open(label2ans_path, "rb")) + self.num_labels = len(self.ans2label) + self._max_region_num = max_region_num + self._max_seq_length = max_seq_length + self._image_features_reader = image_features_reader + self._tokenizer = tokenizer + self._padding_index = padding_index + cache_path = os.path.join('data', task, "cache", task + '_' + split + '_' + str(max_seq_length)+'.pkl') + if not os.path.exists(cache_path): + self.entries = _load_dataset(dataroot, split) + self.tokenize(max_seq_length) + self.tensorize() + cPickle.dump(self.entries, open(cache_path, 'wb')) + else: + logger.info("Loading from %s" %cache_path) + self.entries = cPickle.load(open(cache_path, "rb")) + + def 
tokenize(self, max_length=16): + """Tokenizes the questions. + + This will add q_token in each entry of the dataset. + -1 represent nil, and should be treated as padding_index in embedding + """ + for entry in self.entries: + tokens = self._tokenizer.tokenize(entry["question"]) + tokens = ["[CLS]"] + tokens + ["[SEP]"] + + tokens = [ + self._tokenizer.vocab.get(w, self._tokenizer.vocab["[UNK]"]) + for w in tokens + ] + + tokens = tokens[:max_length] + segment_ids = [0] * len(tokens) + input_mask = [1] * len(tokens) + + if len(tokens) < max_length: + # Note here we pad in front of the sentence + padding = [self._padding_index] * (max_length - len(tokens)) + tokens = tokens + padding + input_mask += padding + segment_ids += padding + + assert_eq(len(tokens), max_length) + entry["q_token"] = tokens + entry["q_input_mask"] = input_mask + entry["q_segment_ids"] = segment_ids + + def tensorize(self): + + for entry in self.entries: + question = torch.from_numpy(np.array(entry["q_token"])) + entry["q_token"] = question + + q_input_mask = torch.from_numpy(np.array(entry["q_input_mask"])) + entry["q_input_mask"] = q_input_mask + + q_segment_ids = torch.from_numpy(np.array(entry["q_segment_ids"])) + entry["q_segment_ids"] = q_segment_ids + + if 'test' not in self.split: + answer = entry["answer"] + labels = np.array(answer["labels"]) + scores = np.array(answer["scores"], dtype=np.float32) + if len(labels): + labels = torch.from_numpy(labels) + scores = torch.from_numpy(scores) + entry["answer"]["labels"] = labels + entry["answer"]["scores"] = scores + else: + entry["answer"]["labels"] = None + entry["answer"]["scores"] = None + + def __getitem__(self, index): + entry = self.entries[index] + image_id = entry["image_id"] + question_id = entry["question_id"] + features, num_boxes, boxes, _ = self._image_features_reader[image_id] + + mix_num_boxes = min(int(num_boxes), self._max_region_num) + mix_boxes_pad = np.zeros((self._max_region_num, 5)) + mix_features_pad = 
np.zeros((self._max_region_num, 2048)) + + image_mask = [1] * (int(mix_num_boxes)) + while len(image_mask) < self._max_region_num: + image_mask.append(0) + + # shuffle the image location here. + # img_idx = list(np.random.permutation(num_boxes-1)[:mix_num_boxes]+1) + # img_idx.append(0) + # mix_boxes_pad[:mix_num_boxes] = boxes[img_idx] + # mix_features_pad[:mix_num_boxes] = features[img_idx] + + mix_boxes_pad[:mix_num_boxes] = boxes[:mix_num_boxes] + mix_features_pad[:mix_num_boxes] = features[:mix_num_boxes] + + features = torch.tensor(mix_features_pad).float() + image_mask = torch.tensor(image_mask).long() + spatials = torch.tensor(mix_boxes_pad).float() + + question = entry["q_token"] + input_mask = entry["q_input_mask"] + segment_ids = entry["q_segment_ids"] + + co_attention_mask = torch.zeros((self._max_region_num, self._max_seq_length)) + target = torch.zeros(self.num_labels) + + if "test" not in self.split: + answer = entry["answer"] + labels = answer["labels"] + scores = answer["scores"] + if labels is not None: + target.scatter_(0, labels, scores) + + return features, spatials, image_mask, question, target, input_mask, segment_ids, co_attention_mask, question_id + + def __len__(self): + return len(self.entries) diff --git a/vilbert/optimization.py b/vilbert/optimization.py new file mode 100644 index 0000000..b0983cc --- /dev/null +++ b/vilbert/optimization.py @@ -0,0 +1,554 @@ +# coding=utf-8 +# Copyright 2018 The Google AI Language Team Authors and The HuggingFace Inc. team. +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 
+# See the License for the specific language governing permissions and +# limitations under the License. +"""PyTorch optimization for BERT model.""" + +import math +import torch +from torch.optim import Optimizer +from torch.optim.optimizer import required +from torch.nn.utils import clip_grad_norm_ +import logging +import abc +import sys + +logger = logging.getLogger(__name__) + + +if sys.version_info >= (3, 4): + ABC = abc.ABC +else: + ABC = abc.ABCMeta('ABC', (), {}) + + +class _LRSchedule(ABC): + """ Parent of all LRSchedules here. """ + warn_t_total = False # is set to True for schedules where progressing beyond t_total steps doesn't make sense + def __init__(self, warmup=0.002, t_total=-1, **kw): + """ + :param warmup: what fraction of t_total steps will be used for linear warmup + :param t_total: how many training steps (updates) are planned + :param kw: + """ + super(_LRSchedule, self).__init__(**kw) + if t_total < 0: + logger.warning("t_total value of {} results in schedule not being applied".format(t_total)) + if not 0.0 <= warmup < 1.0 and not warmup == -1: + raise ValueError("Invalid warmup: {} - should be in [0.0, 1.0[ or -1".format(warmup)) + warmup = max(warmup, 0.) + self.warmup, self.t_total = float(warmup), float(t_total) + self.warned_for_t_total_at_progress = -1 + + def get_lr(self, step, nowarn=False): + """ + :param step: which of t_total steps we're on + :param nowarn: set to True to suppress warning regarding training beyond specified 't_total' steps + :return: learning rate multiplier for current update + """ + if self.t_total < 0: + return 1. + progress = float(step) / self.t_total + ret = self.get_lr_(progress) + # warning for exceeding t_total (only active with warmup_linear + if not nowarn and self.warn_t_total and progress > 1. and progress > self.warned_for_t_total_at_progress: + logger.warning( + "Training beyond specified 't_total'. Learning rate multiplier set to {}. Please set 't_total' of {} correctly." 
+ .format(ret, self.__class__.__name__)) + self.warned_for_t_total_at_progress = progress + # end warning + return ret + + @abc.abstractmethod + def get_lr_(self, progress): + """ + :param progress: value between 0 and 1 (unless going beyond t_total steps) specifying training progress + :return: learning rate multiplier for current update + """ + return 1. + + +class ConstantLR(_LRSchedule): + def get_lr_(self, progress): + return 1. + + +class WarmupCosineSchedule(_LRSchedule): + """ + Linearly increases learning rate from 0 to 1 over `warmup` fraction of training steps. + Decreases learning rate from 1. to 0. over remaining `1 - warmup` steps following a cosine curve. + If `cycles` (default=0.5) is different from default, learning rate follows cosine function after warmup. + """ + warn_t_total = True + def __init__(self, warmup=0.002, t_total=-1, cycles=.5, **kw): + """ + :param warmup: see LRSchedule + :param t_total: see LRSchedule + :param cycles: number of cycles. Default: 0.5, corresponding to cosine decay from 1. at progress==warmup and 0 at progress==1. + :param kw: + """ + super(WarmupCosineSchedule, self).__init__(warmup=warmup, t_total=t_total, **kw) + self.cycles = cycles + + def get_lr_(self, progress): + if progress < self.warmup: + return progress / self.warmup + else: + progress = (progress - self.warmup) / (1 - self.warmup) # progress after warmup + return 0.5 * (1. + math.cos(math.pi * self.cycles * 2 * progress)) + + +class WarmupCosineWithHardRestartsSchedule(WarmupCosineSchedule): + """ + Linearly increases learning rate from 0 to 1 over `warmup` fraction of training steps. + If `cycles` (default=1.) is different from default, learning rate follows `cycles` times a cosine decaying + learning rate (with hard restarts). + """ + def __init__(self, warmup=0.002, t_total=-1, cycles=1., **kw): + super(WarmupCosineWithHardRestartsSchedule, self).__init__(warmup=warmup, t_total=t_total, cycles=cycles, **kw) + assert(cycles >= 1.) 
+ + def get_lr_(self, progress): + if progress < self.warmup: + return progress / self.warmup + else: + progress = (progress - self.warmup) / (1 - self.warmup) # progress after warmup + ret = 0.5 * (1. + math.cos(math.pi * ((self.cycles * progress) % 1))) + return ret + + +class WarmupCosineWithWarmupRestartsSchedule(WarmupCosineWithHardRestartsSchedule): + """ + All training progress is divided in `cycles` (default=1.) parts of equal length. + Every part follows a schedule with the first `warmup` fraction of the training steps linearly increasing from 0. to 1., + followed by a learning rate decreasing from 1. to 0. following a cosine curve. + """ + def __init__(self, warmup=0.002, t_total=-1, cycles=1., **kw): + assert(warmup * cycles < 1.) + warmup = warmup * cycles if warmup >= 0 else warmup + super(WarmupCosineWithWarmupRestartsSchedule, self).__init__(warmup=warmup, t_total=t_total, cycles=cycles, **kw) + + def get_lr_(self, progress): + progress = progress * self.cycles % 1. + if progress < self.warmup: + return progress / self.warmup + else: + progress = (progress - self.warmup) / (1 - self.warmup) # progress after warmup + ret = 0.5 * (1. + math.cos(math.pi * progress)) + return ret + + +class WarmupConstantSchedule(_LRSchedule): + """ + Linearly increases learning rate from 0 to 1 over `warmup` fraction of training steps. + Keeps learning rate equal to 1. after warmup. + """ + def get_lr_(self, progress): + if progress < self.warmup: + return progress / self.warmup + return 1. + + +class WarmupLinearSchedule(_LRSchedule): + """ + Linearly increases learning rate from 0 to 1 over `warmup` fraction of training steps. + Linearly decreases learning rate from 1. to 0. over remaining `1 - warmup` steps. + """ + warn_t_total = True + def get_lr_(self, progress): + if progress < self.warmup: + return progress / self.warmup + return max((progress - 1.) / (self.warmup - 1.), 0.) 
+ + +SCHEDULES = { + None: ConstantLR, + "none": ConstantLR, + "warmup_cosine": WarmupCosineSchedule, + "warmup_constant": WarmupConstantSchedule, + "warmup_linear": WarmupLinearSchedule +} + + +class BertAdam(Optimizer): + """Implements BERT version of Adam algorithm with weight decay fix. + Params: + lr: learning rate + warmup: portion of t_total for the warmup, -1 means no warmup. Default: -1 + t_total: total number of training steps for the learning + rate schedule, -1 means constant learning rate of 1. (no warmup regardless of warmup setting). Default: -1 + schedule: schedule to use for the warmup (see above). + Can be `'warmup_linear'`, `'warmup_constant'`, `'warmup_cosine'`, `'none'`, `None` or a `_LRSchedule` object (see below). + If `None` or `'none'`, learning rate is always kept constant. + Default : `'warmup_linear'` + b1: Adams b1. Default: 0.9 + b2: Adams b2. Default: 0.999 + e: Adams epsilon. Default: 1e-6 + weight_decay: Weight decay. Default: 0.01 + max_grad_norm: Maximum norm for the gradients (-1 means no clipping). 
Default: 1.0 + """ + def __init__(self, params, lr=required, warmup=-1, t_total=-1, schedule='warmup_linear', + b1=0.9, b2=0.999, e=1e-6, weight_decay=0.01, max_grad_norm=1.0, **kwargs): + if lr is not required and lr < 0.0: + raise ValueError("Invalid learning rate: {} - should be >= 0.0".format(lr)) + if not isinstance(schedule, _LRSchedule) and schedule not in SCHEDULES: + raise ValueError("Invalid schedule parameter: {}".format(schedule)) + if not 0.0 <= b1 < 1.0: + raise ValueError("Invalid b1 parameter: {} - should be in [0.0, 1.0[".format(b1)) + if not 0.0 <= b2 < 1.0: + raise ValueError("Invalid b2 parameter: {} - should be in [0.0, 1.0[".format(b2)) + if not e >= 0.0: + raise ValueError("Invalid epsilon value: {} - should be >= 0.0".format(e)) + # initialize schedule object + if not isinstance(schedule, _LRSchedule): + schedule_type = SCHEDULES[schedule] + schedule = schedule_type(warmup=warmup, t_total=t_total) + else: + if warmup != -1 or t_total != -1: + logger.warning("warmup and t_total on the optimizer are ineffective when _LRSchedule object is provided as schedule. " + "Please specify custom warmup and t_total in _LRSchedule object.") + defaults = dict(lr=lr, schedule=schedule, + b1=b1, b2=b2, e=e, weight_decay=weight_decay, + max_grad_norm=max_grad_norm) + self.rate = None + super(BertAdam, self).__init__(params, defaults) + + def show_lr(self): + return self.rate + + def get_lr(self): + lr = [] + for group in self.param_groups: + for p in group['params']: + state = self.state[p] + if len(state) == 0: + return [0] + lr_scheduled = group['lr'] + lr_scheduled *= group['schedule'].get_lr(state['step']) + lr.append(lr_scheduled) + return lr + + def step(self, closure=None): + """Performs a single optimization step. + Arguments: + closure (callable, optional): A closure that reevaluates the model + and returns the loss. 
+ """ + loss = None + if closure is not None: + loss = closure() + + for group in self.param_groups: + for p in group['params']: + if p.grad is None: + continue + grad = p.grad.data + if grad.is_sparse: + raise RuntimeError('Adam does not support sparse gradients, please consider SparseAdam instead') + + state = self.state[p] + + # State initialization + if len(state) == 0: + state['step'] = 0 + # Exponential moving average of gradient values + state['next_m'] = torch.zeros_like(p.data) + # Exponential moving average of squared gradient values + state['next_v'] = torch.zeros_like(p.data) + + next_m, next_v = state['next_m'], state['next_v'] + beta1, beta2 = group['b1'], group['b2'] + + # Add grad clipping + if group['max_grad_norm'] > 0: + clip_grad_norm_(p, group['max_grad_norm']) + + # Decay the first and second moment running average coefficient + # In-place operations to update the averages at the same time + next_m.mul_(beta1).add_(1 - beta1, grad) + next_v.mul_(beta2).addcmul_(1 - beta2, grad, grad) + update = next_m / (next_v.sqrt() + group['e']) + + # Just adding the square of the weights to the loss function is *not* + # the correct way of using L2 regularization/weight decay with Adam, + # since that will interact with the m and v parameters in strange ways. + # + # Instead we want to decay the weights in a manner that doesn't interact + # with the m/v parameters. This is equivalent to adding the square + # of the weights to the loss with plain (non-momentum) SGD. 
+ if group['weight_decay'] > 0.0: + update += group['weight_decay'] * p.data + + lr_scheduled = group['lr'] + lr_scheduled *= group['schedule'].get_lr(state['step']) + + self.rate = lr_scheduled + + update_with_lr = lr_scheduled * update + p.data.add_(-update_with_lr) + + state['step'] += 1 + + # step_size = lr_scheduled * math.sqrt(bias_correction2) / bias_correction1 + # No bias correction + # bias_correction1 = 1 - beta1 ** state['step'] + # bias_correction2 = 1 - beta2 ** state['step'] + + return loss + + +class Adam(Optimizer): + """Implements pytorch version of Adam algorithm with weight decay fix. + Params: + lr: learning rate + warmup: portion of t_total for the warmup, -1 means no warmup. Default: -1 + t_total: total number of training steps for the learning + rate schedule, -1 means constant learning rate of 1. (no warmup regardless of warmup setting). Default: -1 + schedule: schedule to use for the warmup (see above). + Can be `'warmup_linear'`, `'warmup_constant'`, `'warmup_cosine'`, `'none'`, `None` or a `_LRSchedule` object (see below). + If `None` or `'none'`, learning rate is always kept constant. + Default : `'warmup_linear'` + b1: Adams b1. Default: 0.9 + b2: Adams b2. Default: 0.999 + e: Adams epsilon. Default: 1e-6 + weight_decay: Weight decay. Default: 0.01 + max_grad_norm: Maximum norm for the gradients (-1 means no clipping). 
Default: 1.0 + """ + def __init__(self, params, lr=required, warmup=-1, t_total=-1, schedule='warmup_linear', + b1=0.9, b2=0.999, e=1e-8, weight_decay=0, amsgrad=False, max_grad_norm=1.0, **kwargs): + if lr is not required and lr < 0.0: + raise ValueError("Invalid learning rate: {} - should be >= 0.0".format(lr)) + if not isinstance(schedule, _LRSchedule) and schedule not in SCHEDULES: + raise ValueError("Invalid schedule parameter: {}".format(schedule)) + if not 0.0 <= b1 < 1.0: + raise ValueError("Invalid b1 parameter: {} - should be in [0.0, 1.0[".format(b1)) + if not 0.0 <= b2 < 1.0: + raise ValueError("Invalid b2 parameter: {} - should be in [0.0, 1.0[".format(b2)) + if not e >= 0.0: + raise ValueError("Invalid epsilon value: {} - should be >= 0.0".format(e)) + # initialize schedule object + if not isinstance(schedule, _LRSchedule): + schedule_type = SCHEDULES[schedule] + schedule = schedule_type(warmup=warmup, t_total=t_total) + else: + if warmup != -1 or t_total != -1: + logger.warning("warmup and t_total on the optimizer are ineffective when _LRSchedule object is provided as schedule. " + "Please specify custom warmup and t_total in _LRSchedule object.") + defaults = dict(lr=lr, schedule=schedule, + b1=b1, b2=b2, e=e, weight_decay=weight_decay, + amsgrad=amsgrad, max_grad_norm=max_grad_norm) + self.rate = None + super(Adam, self).__init__(params, defaults) + + def __setstate__(self, state): + super(Adam, self).__setstate__(state) + for group in self.param_groups: + group.setdefault('amsgrad', False) + + def show_lr(self): + return self.rate + + def get_lr(self): + lr = [] + for group in self.param_groups: + for p in group['params']: + state = self.state[p] + if len(state) == 0: + return [0] + lr_scheduled = group['lr'] + lr_scheduled *= group['schedule'].get_lr(state['step']) + lr.append(lr_scheduled) + return lr + + def step(self, closure=None): + """Performs a single optimization step. 
+ Arguments: + closure (callable, optional): A closure that reevaluates the model + and returns the loss. + """ + loss = None + if closure is not None: + loss = closure() + + for group in self.param_groups: + for p in group['params']: + if p.grad is None: + continue + grad = p.grad.data + if grad.is_sparse: + raise RuntimeError('Adam does not support sparse gradients, please consider SparseAdam instead') + amsgrad = group['amsgrad'] + + state = self.state[p] + + # State initialization + if len(state) == 0: + state['step'] = 0 + # Exponential moving average of gradient values + state['exp_avg'] = torch.zeros_like(p.data) + # Exponential moving average of squared gradient values + state['exp_avg_sq'] = torch.zeros_like(p.data) + if amsgrad: + # Maintains max of all exp. moving avg. of sq. grad. values + state['max_exp_avg_sq'] = torch.zeros_like(p.data) + + exp_avg, exp_avg_sq = state['exp_avg'], state['exp_avg_sq'] + if amsgrad: + max_exp_avg_sq = state['max_exp_avg_sq'] + beta1, beta2 = group['b1'], group['b2'] + + # Add grad clipping + if group['max_grad_norm'] > 0: + clip_grad_norm_(p, group['max_grad_norm']) + + lr_scheduled = group['lr'] + lr_scheduled *= group['schedule'].get_lr(state['step']) + self.rate = lr_scheduled + + state['step'] += 1 + + if group['weight_decay'] != 0: + grad.add_(group['weight_decay'], p.data) + + # Decay the first and second moment running average coefficient + exp_avg.mul_(beta1).add_(1 - beta1, grad) + exp_avg_sq.mul_(beta2).addcmul_(1 - beta2, grad, grad) + if amsgrad: + # Maintains the maximum of all 2nd moment running avg. till now + torch.max(max_exp_avg_sq, exp_avg_sq, out=max_exp_avg_sq) + # Use the max. for normalizing running avg. 
of gradient + denom = max_exp_avg_sq.sqrt().add_(group['e']) + else: + denom = exp_avg_sq.sqrt().add_(group['e']) + + bias_correction1 = 1 - beta1 ** state['step'] + bias_correction2 = 1 - beta2 ** state['step'] + step_size = lr_scheduled * math.sqrt(bias_correction2) / bias_correction1 + + p.data.addcdiv_(-step_size, exp_avg, denom) + + return loss + +class Adamax(Optimizer): + """Implements Adamax algorithm (a variant of Adam based on infinity norm). + It has been proposed in `Adam: A Method for Stochastic Optimization`__. + Arguments: + params (iterable): iterable of parameters to optimize or dicts defining + parameter groups + lr (float, optional): learning rate (default: 2e-3) + betas (Tuple[float, float], optional): coefficients used for computing + running averages of gradient and its square + eps (float, optional): term added to the denominator to improve + numerical stability (default: 1e-8) + weight_decay (float, optional): weight decay (L2 penalty) (default: 0) + __ https://arxiv.org/abs/1412.6980 + """ + + def __init__(self, params, lr=required, warmup=-1, t_total=-1, schedule='warmup_linear', + b1=0.9, b2=0.999, e=1e-8, weight_decay=0, max_grad_norm=1.0, **kwargs): + if not 0.0 <= lr: + raise ValueError("Invalid learning rate: {}".format(lr)) + if not 0.0 <= e: + raise ValueError("Invalid epsilon value: {}".format(e)) + if not 0.0 <= b1 < 1.0: + raise ValueError("Invalid b1 parameter: {} - should be in [0.0, 1.0[".format(b1)) + if not 0.0 <= b2 < 1.0: + raise ValueError("Invalid b2 parameter: {} - should be in [0.0, 1.0[".format(b2)) + if not 0.0 <= weight_decay: + raise ValueError("Invalid weight_decay value: {}".format(weight_decay)) + + if not isinstance(schedule, _LRSchedule): + schedule_type = SCHEDULES[schedule] + schedule = schedule_type(warmup=warmup, t_total=t_total) + else: + if warmup != -1 or t_total != -1: + logger.warning("warmup and t_total on the optimizer are ineffective when _LRSchedule object is provided as schedule. 
" + "Please specify custom warmup and t_total in _LRSchedule object.") + + defaults = dict(lr=lr, schedule=schedule, + b1=b1, b2=b2, e=e, weight_decay=weight_decay, max_grad_norm=max_grad_norm) + self.rate = None + super(Adamax, self).__init__(params, defaults) + + def show_lr(self): + return self.rate + + def get_lr(self): + lr = [] + for group in self.param_groups: + for p in group['params']: + state = self.state[p] + if len(state) == 0: + return [0] + lr_scheduled = group['lr'] + lr_scheduled *= group['schedule'].get_lr(state['step']) + lr.append(lr_scheduled) + return lr + + def step(self, closure=None): + """Performs a single optimization step. + Arguments: + closure (callable, optional): A closure that reevaluates the model + and returns the loss. + """ + loss = None + if closure is not None: + loss = closure() + + for group in self.param_groups: + for p in group['params']: + if p.grad is None: + continue + grad = p.grad.data + if grad.is_sparse: + raise RuntimeError('Adamax does not support sparse gradients') + state = self.state[p] + + # State initialization + if len(state) == 0: + state['step'] = 0 + state['exp_avg'] = torch.zeros_like(p.data) + state['exp_inf'] = torch.zeros_like(p.data) + + exp_avg, exp_inf = state['exp_avg'], state['exp_inf'] + beta1, beta2 = group['b1'], group['b2'] + eps = group['e'] + + # Add grad clipping + if group['max_grad_norm'] > 0: + clip_grad_norm_(p, group['max_grad_norm']) + + lr_scheduled = group['lr'] + lr_scheduled *= group['schedule'].get_lr(state['step']) + + self.rate = lr_scheduled + state['step'] += 1 + + if group['weight_decay'] != 0: + grad = grad.add(group['weight_decay'], p.data) + + # Update biased first moment estimate. + exp_avg.mul_(beta1).add_(1 - beta1, grad) + # Update the exponentially weighted infinity norm. 
+ norm_buf = torch.cat([ + exp_inf.mul_(beta2).unsqueeze(0), + grad.abs().add_(eps).unsqueeze_(0) + ], 0) + torch.max(norm_buf, 0, keepdim=False, out=(exp_inf, exp_inf.new().long())) + + bias_correction = 1 - beta1 ** state['step'] + clr = lr_scheduled / bias_correction + + p.data.addcdiv_(-clr, exp_avg, exp_inf) + + return loss \ No newline at end of file diff --git a/vilbert/task_utils.py b/vilbert/task_utils.py new file mode 100644 index 0000000..27ec9e6 --- /dev/null +++ b/vilbert/task_utils.py @@ -0,0 +1,406 @@ +from io import open +import json +import logging +import os +import sys + +import torch +import torch.nn.functional as F +import torch.nn as nn +import torch.distributed as dist +from torch.utils.data import DataLoader, Dataset, RandomSampler +from torch.utils.data.distributed import DistributedSampler +from pytorch_pretrained_bert.tokenization import BertTokenizer +from vilbert.datasets import DatasetMapTrain, DatasetMapEval +from vilbert.datasets._image_features_reader import ImageFeaturesH5Reader +import pdb + +logger = logging.getLogger(__name__) # pylint: disable=invalid-name + +LossMap = {'BCEWithLogitLoss': nn.BCEWithLogitsLoss(reduction='mean'), + 'CrossEntropyLoss': nn.CrossEntropyLoss(), + } + +def ForwardModelsVal(args, task_cfg, device, task_id, batch, model, task_losses): + batch = tuple(t.cuda(device=device, non_blocking=True) for t in batch) + features, spatials, image_mask, question, target, input_mask, segment_ids, co_attention_mask, question_id = batch + batch_size = features.size(0) + + if task_id in ['TASK2', 'TASK3', 'TASK5', 'TASK6', 'TASK7']: + max_num_bbox = features.size(1) + num_options = question.size(1) + features = features.unsqueeze(1).expand(batch_size, num_options, max_num_bbox, 2048).contiguous().view(-1, max_num_bbox, 2048) + spatials = spatials.unsqueeze(1).expand(batch_size, num_options, max_num_bbox, 5).contiguous().view(-1, max_num_bbox, 5) + image_mask = image_mask.unsqueeze(1).expand(batch_size, num_options, 
max_num_bbox).contiguous().view(-1, max_num_bbox) + question = question.view(-1, question.size(2)) + input_mask = input_mask.view(-1, input_mask.size(2)) + segment_ids = segment_ids.view(-1, segment_ids.size(2)) + co_attention_mask = co_attention_mask.view(-1, co_attention_mask.size(2), co_attention_mask.size(3)) + + elif task_id in ['TASK8', 'TASK9']: + batch_size = features.size(0) + max_num_bbox = features.size(1) + num_options = question.size(1) + features = features.view(-1, features.size(2), features.size(3)) + spatials = spatials.view(-1, spatials.size(2), spatials.size(3)) + image_mask = image_mask.view(-1, image_mask.size(2)) + question = question.view(-1, question.size(2)) + input_mask = input_mask.view(-1, input_mask.size(2)) + segment_ids = segment_ids.view(-1, segment_ids.size(2)) + co_attention_mask = co_attention_mask.view(-1, co_attention_mask.size(2), co_attention_mask.size(3)) + + vil_prediction, vil_logit, vil_binary_prediction, vision_prediction, vision_logit, linguisic_prediction, linguisic_logit = \ + model(question, features, spatials, segment_ids, input_mask, image_mask, co_attention_mask) + + if task_cfg[task_id]['type'] == 'VL-classifier': + loss = task_losses[task_id](vil_prediction, target) + loss = loss.mean() * target.size(1) + batch_score = compute_score_with_logits(vil_prediction, target).sum() + + elif task_cfg[task_id]['type'] == 'VL-logit': + vil_logit = vil_logit.view(batch_size, num_options) + loss = task_losses[task_id](vil_logit, target) + _, preds = torch.max(vil_logit, 1) + batch_score = (preds == target).sum() + + elif task_cfg[task_id]['type'] == 'V-logit': + loss = task_losses[task_id](vision_logit, target) + loss = loss.mean() * target.size(1) + _, select_idx = torch.max(vision_logit, dim=1) + select_target = target.squeeze(2).gather(1, select_idx.view(-1,1)) + batch_score = torch.sum(select_target>0.5).item() + + return float(loss), float(batch_score), batch_size + +def ForwardModelsTrain(args, task_cfg, device, 
task_id, task_count, task_iter_train, task_dataloader_train, model, task_losses, task_start_iter):
+ # given the current task, decide whether to forward the model and forward with specific loss.
+
+ # reset the task iteration when needed.
+ if task_count[task_id] % len(task_dataloader_train[task_id]) == 0:
+ task_iter_train[task_id] = iter(task_dataloader_train[task_id])
+
+ task_count[task_id] += 1
+ # get the batch
+ batch = next(task_iter_train[task_id])
+ batch = tuple(t.cuda(device=device, non_blocking=True) for t in batch)
+ features, spatials, image_mask, question, target, input_mask, segment_ids, co_attention_mask, question_id = batch
+ batch_size = features.size(0)
+
+ if task_id in ['TASK2', 'TASK3', 'TASK5', 'TASK6', 'TASK7']:
+ max_num_bbox = features.size(1)
+ num_options = question.size(1)
+ features = features.unsqueeze(1).expand(batch_size, num_options, max_num_bbox, 2048).contiguous().view(-1, max_num_bbox, 2048)
+ spatials = spatials.unsqueeze(1).expand(batch_size, num_options, max_num_bbox, 5).contiguous().view(-1, max_num_bbox, 5)
+ image_mask = image_mask.unsqueeze(1).expand(batch_size, num_options, max_num_bbox).contiguous().view(-1, max_num_bbox)
+ question = question.view(-1, question.size(2))
+ input_mask = input_mask.view(-1, input_mask.size(2))
+ segment_ids = segment_ids.view(-1, segment_ids.size(2))
+ co_attention_mask = co_attention_mask.view(-1, co_attention_mask.size(2), co_attention_mask.size(3))
+
+ elif task_id in ['TASK8', 'TASK9']:
+ max_num_bbox = features.size(1)
+ num_options = question.size(1)
+ features = features.view(-1, features.size(2), features.size(3))
+ spatials = spatials.view(-1, spatials.size(2), spatials.size(3))
+ image_mask = image_mask.view(-1, image_mask.size(2))
+ question = question.view(-1, question.size(2))
+ input_mask = input_mask.view(-1, input_mask.size(2))
+ segment_ids = segment_ids.view(-1, segment_ids.size(2))
+ co_attention_mask = co_attention_mask.view(-1, co_attention_mask.size(2),
co_attention_mask.size(3))
+
+ # get the model output
+ vil_prediction, vil_logit, vil_binary_prediction, vision_prediction, vision_logit, linguisic_prediction, linguisic_logit = \
+ model(question, features, spatials, segment_ids, input_mask, image_mask, co_attention_mask)
+
+ # for different tasks, we use different outputs to calculate the loss.
+ if task_cfg[task_id]['type'] == 'VL-classifier':
+ loss = task_losses[task_id](vil_prediction, target)
+ loss = loss.mean() * target.size(1)
+ batch_score = compute_score_with_logits(vil_prediction, target).sum() / float(batch_size)
+
+ elif task_cfg[task_id]['type'] == 'VL-logit':
+ vil_logit = vil_logit.view(batch_size, num_options)
+ loss = task_losses[task_id](vil_logit, target)
+ _, preds = torch.max(vil_logit, 1)
+ batch_score = float((preds == target).sum()) / float(batch_size)
+
+ elif task_cfg[task_id]['type'] == 'V-logit':
+ loss = task_losses[task_id](vision_logit, target)
+ loss = loss.mean() * target.size(1)
+ _, select_idx = torch.max(vision_logit, dim=1)
+ select_target = target.squeeze(2).gather(1, select_idx.view(-1,1))
+ batch_score = float(torch.sum(select_target>0.5)) / batch_size
+
+ return loss, batch_score
+
+
+ def LoadLosses(args, task_cfg, task_ids):
+
+ losses = {}
+ task_types = []
+ num_labels = 0
+ for i, task_id in enumerate(task_ids):
+ task = 'TASK' + task_id
+ model_type = task_cfg[task]['type']
+ if model_type not in task_types:
+ task_types.append(model_type)
+ losses[task] = LossMap[task_cfg[task]['loss']]
+
+ return losses
+
+ def LoadDatasets(args, task_cfg, ids, split='trainval'):
+
+ tokenizer = BertTokenizer.from_pretrained(
+ args.bert_model, do_lower_case=True
+ )
+
+ task_feature_reader1 = {}
+ task_feature_reader2 = {}
+ for i, task_id in enumerate(ids):
+ task = 'TASK' + task_id
+ if task_cfg[task]['features_h5path1'] not in task_feature_reader1:
+ task_feature_reader1[task_cfg[task]['features_h5path1']] = None
+ if
task_cfg[task]['features_h5path2'] not in task_feature_reader2:
+ task_feature_reader2[task_cfg[task]['features_h5path2']] = None
+
+ # initialize the feature reader
+ for features_h5path in task_feature_reader1.keys():
+ if features_h5path != '':
+ task_feature_reader1[features_h5path] = ImageFeaturesH5Reader(features_h5path,
+ args.in_memory)
+
+ for features_h5path in task_feature_reader2.keys():
+ if features_h5path != '':
+ task_feature_reader2[features_h5path] = ImageFeaturesH5Reader(features_h5path, args.in_memory)
+
+ task_datasets_train = {}
+ task_datasets_val = {}
+ task_dataloader_train = {}
+ task_dataloader_val = {}
+ task_ids = []
+ task_batch_size = {}
+ task_num_iters = {}
+
+ for i, task_id in enumerate(ids):
+ task = 'TASK' + task_id
+ task_ids.append(task)
+ batch_size = task_cfg[task]['batch_size'] // args.gradient_accumulation_steps
+ num_workers = args.num_workers
+ if args.local_rank != -1:
+ batch_size = int(batch_size / dist.get_world_size())
+ num_workers = int(num_workers / dist.get_world_size())
+
+ # num_workers = int(num_workers / len(ids))
+ logger.info("Loading %s Dataset with batch size %d" %(task_cfg[task]['name'], batch_size))
+
+ task_datasets_train[task] = None
+ if 'train' in split:
+ task_datasets_train[task] = DatasetMapTrain[task](
+ task=task_cfg[task]['name'],
+ dataroot=task_cfg[task]['dataroot'],
+ annotations_jsonpath=task_cfg[task]['train_annotations_jsonpath'],
+ split=task_cfg[task]['train_split'],
+ image_features_reader= task_feature_reader1[task_cfg[task]['features_h5path1']],
+ gt_image_features_reader= task_feature_reader2[task_cfg[task]['features_h5path2']],
+ tokenizer=tokenizer,
+ padding_index=0,
+ max_seq_length=task_cfg[task]['max_seq_length'],
+ max_region_num=task_cfg[task]['max_region_num'],
+ )
+
+ task_datasets_val[task] = None
+ if 'val' in split:
+ task_datasets_val[task] = DatasetMapTrain[task](
+ task=task_cfg[task]['name'],
+ dataroot=task_cfg[task]['dataroot'],
+
annotations_jsonpath=task_cfg[task]['val_annotations_jsonpath'], + split=task_cfg[task]['val_split'], + image_features_reader= task_feature_reader1[task_cfg[task]['features_h5path1']], + gt_image_features_reader= task_feature_reader2[task_cfg[task]['features_h5path2']], + tokenizer=tokenizer, + padding_index=0, + max_seq_length=task_cfg[task]['max_seq_length'], + max_region_num=task_cfg[task]['max_region_num']) + + task_num_iters[task] = 0 + task_batch_size[task] = 0 + if 'train' in split: + if args.local_rank == -1: + train_sampler = RandomSampler(task_datasets_train[task]) + else: + #TODO: check if this works with current data generator from disk that relies on next(file) + # (it doesn't return item back by index) + train_sampler = DistributedSampler(task_datasets_train[task]) + + # num_workers = 1 + task_dataloader_train[task] = DataLoader( + task_datasets_train[task], + sampler=train_sampler, + # shuffle=False, + batch_size=batch_size, + num_workers=num_workers, + pin_memory=True, + ) + task_num_iters[task] = len(task_dataloader_train[task]) + task_batch_size[task] = batch_size + + if 'val' in split: + task_dataloader_val[task] = DataLoader( + task_datasets_val[task], + shuffle=False, + batch_size=batch_size, + num_workers=num_workers, + pin_memory=True, + ) + + return task_batch_size, task_num_iters, task_ids, task_datasets_train, task_datasets_val, task_dataloader_train, task_dataloader_val + + +def LoadDatasetEval(args, task_cfg, ids): + + tokenizer = BertTokenizer.from_pretrained( + args.bert_model, do_lower_case=True + ) + + task_feature_reader1 = {} + task_feature_reader2 = {} + for i, task_id in enumerate(ids): + task = 'TASK' + task_id + if task_cfg[task]['features_h5path1'] not in task_feature_reader1: + task_feature_reader1[task_cfg[task]['features_h5path1']] = None + if task_cfg[task]['features_h5path2'] not in task_feature_reader2: + task_feature_reader2[task_cfg[task]['features_h5path2']] = None + + # initilzie the feature reader + for 
features_h5path in task_feature_reader1.keys(): + if features_h5path != '': + task_feature_reader1[features_h5path] = ImageFeaturesH5Reader(features_h5path, + args.in_memory) + + for features_h5path in task_feature_reader2.keys(): + if features_h5path != '': + task_feature_reader2[features_h5path] = ImageFeaturesH5Reader(features_h5path, args.in_memory) + + task_datasets_val = {} + task_dataloader_val = {} + task_ids = [] + task_batch_size = {} + task_num_iters = {} + + for i, task_id in enumerate(ids): + task = 'TASK' + task_id + task_ids.append(task) + batch_size = args.batch_size + if args.local_rank != -1: + batch_size = int(batch_size / dist.get_world_size()) + + num_workers = int(args.num_workers / len(ids)) + logger.info("Loading %s Dataset with batch size %d" %(task_cfg[task]['name'], batch_size)) + + if args.split: + eval_split = args.split + else: + eval_split = task_cfg[task]['val_split'] + + task_datasets_val[task] = DatasetMapEval[task]( + task=task_cfg[task]['name'], + dataroot=task_cfg[task]['dataroot'], + annotations_jsonpath=task_cfg[task]['val_annotations_jsonpath'], + split=eval_split, + image_features_reader= task_feature_reader1[task_cfg[task]['features_h5path1']], + gt_image_features_reader= task_feature_reader2[task_cfg[task]['features_h5path2']], + tokenizer=tokenizer, + padding_index=0, + max_seq_length=task_cfg[task]['max_seq_length'], + max_region_num=task_cfg[task]['max_region_num']) + + task_dataloader_val[task] = DataLoader( + task_datasets_val[task], + shuffle=False, + batch_size=batch_size, + num_workers=10, + pin_memory=True, + ) + + task_num_iters[task] = len(task_dataloader_val[task]) + task_batch_size[task] = batch_size + + return task_batch_size, task_num_iters, task_ids, task_datasets_val, task_dataloader_val + + +def compute_score_with_logits(logits, labels): + logits = torch.max(logits, 1)[1].data # argmax + one_hots = torch.zeros(*labels.size()).cuda() + one_hots.scatter_(1, logits.view(-1, 1), 1) + scores = one_hots * 
labels + return scores + +def EvaluatingModel(args, task_cfg, device, task_id, batch, model, task_dataloader, task_losses, results, others): + batch = tuple(t.cuda(device=device, non_blocking=True) for t in batch) + features, spatials, image_mask, question, target, input_mask, segment_ids, co_attention_mask, question_id = batch + batch_size = features.size(0) + + if task_id in ['TASK2', 'TASK6', 'TASK7']: + max_num_bbox = features.size(1) + num_options = question.size(1) + features = features.unsqueeze(1).expand(batch_size, num_options, max_num_bbox, 2048).contiguous().view(-1, max_num_bbox, 2048) + spatials = spatials.unsqueeze(1).expand(batch_size, num_options, max_num_bbox, 5).contiguous().view(-1, max_num_bbox, 5) + image_mask = image_mask.unsqueeze(1).expand(batch_size, num_options, max_num_bbox).contiguous().view(-1, max_num_bbox) + question = question.view(-1, question.size(2)) + input_mask = input_mask.view(-1, input_mask.size(2)) + segment_ids = segment_ids.view(-1, segment_ids.size(2)) + co_attention_mask = co_attention_mask.view(-1, co_attention_mask.size(2), co_attention_mask.size(3)) + + elif task_id in ['TASK8', 'TASK9']: + batch_size = features.size(0) + max_num_bbox = features.size(1) + num_options = question.size(1) + features = features.view(-1, features.size(2), features.size(3)) + spatials = spatials.view(-1, spatials.size(2), spatials.size(3)) + image_mask = image_mask.view(-1, image_mask.size(2)) + question = question.view(-1, question.size(2)) + input_mask = input_mask.view(-1, input_mask.size(2)) + segment_ids = segment_ids.view(-1, segment_ids.size(2)) + co_attention_mask = co_attention_mask.view(-1, co_attention_mask.size(2), co_attention_mask.size(3)) + + with torch.no_grad(): + vil_prediction, vil_logit, vil_binary_prediction, vision_prediction, vision_logit, linguisic_prediction, linguisic_logit \ + = model(question, features, spatials, segment_ids, input_mask, image_mask, co_attention_mask) + + if task_cfg[task_id]['type'] == 
'VL-classifier': + logits = torch.max(vil_prediction, 1)[1].data # argmax + sorted_score, sorted_idx = torch.sort(-vil_prediction) + topk = 8 # top candidate. + topkInd = sorted_idx[:,:topk] + loss = 0 + batch_score = 0 + for i in range(logits.size(0)): + results.append({'question_id':question_id[i].item(), \ + 'answer':task_dataloader[task_id].dataset.label2ans[logits[i].item()]}) + + # save top 8 as options. + others.append({'question_id':question_id[i].item(), \ + 'answer':[task_dataloader[task_id].dataset.label2ans[idx.item()] for idx in topkInd[i]]}) + + elif task_cfg[task_id]['type'] == 'VL-logit': + vil_logit = vil_logit.view(batch_size, num_options) + loss = task_losses[task_id](vil_logit, target) + _, preds = torch.max(vil_logit, 1) + batch_score = (preds == target).sum() + + probs = torch.softmax(vil_logit, dim=1) + for i in range(vil_logit.size(0)): + results.append({'question_id':question_id[i].item(), 'answer':[prob.item() for prob in probs[i]]}) + + elif task_cfg[task_id]['type'] == 'V-logit': + loss = task_losses[task_id](vision_logit, target) + loss = loss.mean() * target.size(1) + _, select_idx = torch.max(vision_logit, dim=1) + select_target = target.squeeze(2).gather(1, select_idx.view(-1,1)) + batch_score = torch.sum(select_target>0.5).item() + + for i in range(select_idx.size(0)): + results.append({'id':question_id[i].item(), 'target':select_idx[i].item(), 'IOU': select_target[i].item()}) + + return float(loss), float(batch_score), batch_size, results, others \ No newline at end of file diff --git a/vilbert/utils.py b/vilbert/utils.py new file mode 100644 index 0000000..a2ba465 --- /dev/null +++ b/vilbert/utils.py @@ -0,0 +1,346 @@ +""" +""" +from io import open +import json +import logging +from functools import wraps +from hashlib import sha256 +from pathlib import Path +import os +import shutil +import sys +import tempfile +from urllib.parse import urlparse + +import boto3 +import requests +from botocore.exceptions import ClientError +from 
tqdm import tqdm
+ from tensorboardX import SummaryWriter
+ from time import gmtime, strftime
+ from bisect import bisect
+
+ import pdb
+
+ PYTORCH_PRETRAINED_BERT_CACHE = Path(
+ os.getenv("PYTORCH_PRETRAINED_BERT_CACHE", Path.home() / ".pytorch_pretrained_bert")
+ )
+
+
+ logger = logging.getLogger(__name__)  # pylint: disable=invalid-name
+
+ def lr_warmup(i_iter, cfg):
+ if (
+ cfg["training_parameters"]["use_warmup"] is True
+ and i_iter <= cfg["training_parameters"]["warmup_iterations"]
+ ):
+ alpha = float(i_iter) / float(cfg["training_parameters"]["warmup_iterations"])
+ return cfg["training_parameters"]["warmup_factor"] * (1.0 - alpha) + alpha
+ else:
+ idx = bisect(cfg["training_parameters"]["lr_steps"], i_iter)
+ return pow(cfg["training_parameters"]["lr_ratio"], idx)
+
+ class tbLogger(object):
+ def __init__(self, log_dir, txt_dir, task_names, task_ids, task_num_iters, gradient_accumulation_steps, save_logger=True, txt_name='out.txt'):
+ logger.info("logging file at: " + log_dir)
+
+ self.save_logger=save_logger
+ if self.save_logger:
+ self.logger = SummaryWriter(log_dir=log_dir)
+
+ self.txt_f = open(txt_dir + '/' + txt_name, 'w')
+ self.task_id2name = {ids:name.replace('+', 'plus') for ids, name in zip(task_ids, task_names)}
+ self.task_ids = task_ids
+ self.task_loss = {task_id:0 for task_id in task_ids}
+ self.task_loss_tmp = {task_id:0 for task_id in task_ids}
+ self.task_score_tmp = {task_id:0 for task_id in task_ids}
+ self.task_norm_tmp = {task_id:0 for task_id in task_ids}
+ self.task_step = {task_id:0 for task_id in task_ids}
+ self.task_step_tmp = {task_id:0 for task_id in task_ids}
+ self.task_num_iters = task_num_iters
+ self.epochId = 0
+ self.gradient_accumulation_steps = gradient_accumulation_steps
+ self.task_loss_val = {task_id:0 for task_id in task_ids}
+ self.task_score_val = {task_id:0 for task_id in task_ids}
+ self.task_step_val = {task_id:0 for task_id in task_ids}
+ self.task_datasize_val = {task_id:0 for task_id in task_ids}
+
+ def
txt_close(self): + self.txt_f.close() + + def linePlot(self, step, val, split, key, xlabel="None"): + if self.save_logger: + self.logger.add_scalar(split + "/" + key, val, step) + + def step_train(self, epochId, stepId, loss, score, norm, task_id, split): + self.task_loss[task_id] += loss + self.task_loss_tmp[task_id] += loss + self.task_score_tmp[task_id] += score + self.task_norm_tmp[task_id] += norm + self.task_step[task_id] += self.gradient_accumulation_steps + self.task_step_tmp[task_id] += self.gradient_accumulation_steps + self.epochId = epochId + + # plot on tensorboard. + self.linePlot(stepId, loss, split, self.task_id2name[task_id] + '_loss') + self.linePlot(stepId, score, split, self.task_id2name[task_id] + '_score') + + def step_val(self, epochId, loss, score, task_id, batch_size, split): + self.task_loss_val[task_id] += loss + self.task_score_val[task_id] += score + self.task_step_val[task_id] += self.gradient_accumulation_steps + self.task_datasize_val[task_id] += batch_size + + def showLossVal(self): + progressInfo = "Eval Ep: %d " %self.epochId + lossInfo = 'Validation ' + ave_score = 0 + ave_loss = 0 + for task_id in self.task_ids: + loss = self.task_loss_val[task_id] / float(self.task_step_val[task_id]) + score = self.task_score_val[task_id] / float(self.task_datasize_val[task_id]) + ave_score += score + ave_loss += loss + lossInfo += '[%s]: loss %.3f score %.3f ' %(self.task_id2name[task_id], loss, score * 100.0) + + self.linePlot(self.epochId, loss, 'val', self.task_id2name[task_id] + '_loss') + self.linePlot(self.epochId, score, 'val', self.task_id2name[task_id] + '_score') + + ave_score = ave_score / len(self.task_ids) + self.task_loss_val = {task_id:0 for task_id in self.task_loss_val} + self.task_score_val = {task_id:0 for task_id in self.task_score_val} + self.task_datasize_val = {task_id:0 for task_id in self.task_datasize_val} + self.task_step_val = {task_id:0 for task_id in self.task_ids} + logger.info(lossInfo) + print(lossInfo, 
file=self.txt_f) + return ave_score + + def showLossTrain(self): + # show the current loss, once showed, reset the loss. + lossInfo = '' + for task_id in self.task_ids: + if self.task_num_iters[task_id] > 0: + if self.task_step_tmp[task_id]: + lossInfo += '[%s]: iter %d Ep: %.2f loss %.3f score %.3f lr %.6g ' %(self.task_id2name[task_id], \ + self.task_step[task_id], self.task_step[task_id] / float(self.task_num_iters[task_id]), \ + self.task_loss_tmp[task_id] / float(self.task_step_tmp[task_id]), \ + self.task_score_tmp[task_id] / float(self.task_step_tmp[task_id]), \ + self.task_norm_tmp[task_id] / float(self.task_step_tmp[task_id])) + + logger.info(lossInfo) + print(lossInfo, file=self.txt_f) + + self.task_step_tmp = {task_id:0 for task_id in self.task_ids} + self.task_loss_tmp = {task_id:0 for task_id in self.task_ids} + self.task_score_tmp = {task_id:0 for task_id in self.task_ids} + self.task_norm_tmp = {task_id:0 for task_id in self.task_ids} + +def url_to_filename(url, etag=None): + """ + Convert `url` into a hashed filename in a repeatable way. + If `etag` is specified, append its hash to the url's, delimited + by a period. + """ + url_bytes = url.encode("utf-8") + url_hash = sha256(url_bytes) + filename = url_hash.hexdigest() + + if etag: + etag_bytes = etag.encode("utf-8") + etag_hash = sha256(etag_bytes) + filename += "." + etag_hash.hexdigest() + + return filename + + +def filename_to_url(filename, cache_dir=None): + """ + Return the url and etag (which may be ``None``) stored for `filename`. + Raise ``EnvironmentError`` if `filename` or its stored metadata do not exist. 
+ """ + if cache_dir is None: + cache_dir = PYTORCH_PRETRAINED_BERT_CACHE + if sys.version_info[0] == 3 and isinstance(cache_dir, Path): + cache_dir = str(cache_dir) + + cache_path = os.path.join(cache_dir, filename) + if not os.path.exists(cache_path): + raise EnvironmentError("file {} not found".format(cache_path)) + + meta_path = cache_path + ".json" + if not os.path.exists(meta_path): + raise EnvironmentError("file {} not found".format(meta_path)) + + with open(meta_path, encoding="utf-8") as meta_file: + metadata = json.load(meta_file) + url = metadata["url"] + etag = metadata["etag"] + + return url, etag + + +def cached_path(url_or_filename, cache_dir=None): + """ + Given something that might be a URL (or might be a local path), + determine which. If it's a URL, download the file and cache it, and + return the path to the cached file. If it's already a local path, + make sure the file exists and then return the path. + """ + if cache_dir is None: + cache_dir = PYTORCH_PRETRAINED_BERT_CACHE + if sys.version_info[0] == 3 and isinstance(url_or_filename, Path): + url_or_filename = str(url_or_filename) + if sys.version_info[0] == 3 and isinstance(cache_dir, Path): + cache_dir = str(cache_dir) + + parsed = urlparse(url_or_filename) + + if parsed.scheme in ("http", "https", "s3"): + # URL, so get it from the cache (downloading if necessary) + return get_from_cache(url_or_filename, cache_dir) + elif os.path.exists(url_or_filename): + # File, and it exists. + return url_or_filename + elif parsed.scheme == "": + # File, but it doesn't exist. 
+ raise EnvironmentError("file {} not found".format(url_or_filename)) + else: + # Something unknown + raise ValueError("unable to parse {} as a URL or as a local path".format(url_or_filename)) + +def split_s3_path(url): + """Split a full s3 path into the bucket name and path.""" + parsed = urlparse(url) + if not parsed.netloc or not parsed.path: + raise ValueError("bad s3 path {}".format(url)) + bucket_name = parsed.netloc + s3_path = parsed.path + # Remove '/' at beginning of path. + if s3_path.startswith("/"): + s3_path = s3_path[1:] + return bucket_name, s3_path + + +def s3_request(func): + """ + Wrapper function for s3 requests in order to create more helpful error + messages. + """ + + @wraps(func) + def wrapper(url, *args, **kwargs): + try: + return func(url, *args, **kwargs) + except ClientError as exc: + if int(exc.response["Error"]["Code"]) == 404: + raise EnvironmentError("file {} not found".format(url)) + else: + raise + return wrapper + +@s3_request +def s3_etag(url): + """Check ETag on S3 object.""" + s3_resource = boto3.resource("s3") + bucket_name, s3_path = split_s3_path(url) + s3_object = s3_resource.Object(bucket_name, s3_path) + return s3_object.e_tag + +@s3_request +def s3_get(url, temp_file): + """Pull a file directly from S3.""" + s3_resource = boto3.resource("s3") + bucket_name, s3_path = split_s3_path(url) + s3_resource.Bucket(bucket_name).download_fileobj(s3_path, temp_file) + +def http_get(url, temp_file): + req = requests.get(url, stream=True) + content_length = req.headers.get("Content-Length") + total = int(content_length) if content_length is not None else None + progress = tqdm(unit="B", total=total) + for chunk in req.iter_content(chunk_size=1024): + if chunk: # filter out keep-alive new chunks + progress.update(len(chunk)) + temp_file.write(chunk) + progress.close() + + +def get_from_cache(url, cache_dir=None): + """ + Given a URL, look for the corresponding dataset in the local cache. + If it's not there, download it. 
Then return the path to the cached file. + """ + if cache_dir is None: + cache_dir = PYTORCH_PRETRAINED_BERT_CACHE + if sys.version_info[0] == 3 and isinstance(cache_dir, Path): + cache_dir = str(cache_dir) + + if not os.path.exists(cache_dir): + os.makedirs(cache_dir) + + # Get eTag to add to filename, if it exists. + if url.startswith("s3://"): + etag = s3_etag(url) + else: + response = requests.head(url, allow_redirects=True) + if response.status_code != 200: + raise IOError( + "HEAD request failed for url {} with status code {}".format( + url, response.status_code + ) + ) + etag = response.headers.get("ETag") + + filename = url_to_filename(url, etag) + + # get cache path to put the file + cache_path = os.path.join(cache_dir, filename) + + if not os.path.exists(cache_path): + # Download to temporary file, then copy to cache dir once finished. + # Otherwise you get corrupt cache entries if the download gets interrupted. + with tempfile.NamedTemporaryFile() as temp_file: + logger.info("%s not found in cache, downloading to %s", url, temp_file.name) + + # GET file object + if url.startswith("s3://"): + s3_get(url, temp_file) + else: + http_get(url, temp_file) + + # we are copying the file before closing it, so flush to avoid truncation + temp_file.flush() + # shutil.copyfileobj() starts at the current position, so go to the start + temp_file.seek(0) + + logger.info("copying %s to cache at %s", temp_file.name, cache_path) + with open(cache_path, "wb") as cache_file: + shutil.copyfileobj(temp_file, cache_file) + + logger.info("creating metadata file for %s", cache_path) + meta = {"url": url, "etag": etag} + meta_path = cache_path + ".json" + with open(meta_path, "w", encoding="utf-8") as meta_file: + json.dump(meta, meta_file) + + logger.info("removing temp file %s", temp_file.name) + + return cache_path + + +def read_set_from_file(filename): + """ + Extract a de-duped collection (set) of text from a file. + Expected file format is one item per line. 
+ """ + collection = set() + with open(filename, "r", encoding="utf-8") as file_: + for line in file_: + collection.add(line.rstrip()) + return collection + +def get_file_extension(path, dot=True, lower=True): + ext = os.path.splitext(path)[1] + ext = ext if dot else ext[1:] + return ext.lower() if lower else ext + diff --git a/vilbert/vilbert.py b/vilbert/vilbert.py new file mode 100644 index 0000000..d4e6a62 --- /dev/null +++ b/vilbert/vilbert.py @@ -0,0 +1,1581 @@ +# coding=utf-8 +# Copyright 2018 The Google AI Language Team Authors and The HuggingFace Inc. team. +# Copyright (c) 2018, NVIDIA CORPORATION. All rights reserved. +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. 
+"""PyTorch BERT model.""" + +import copy +import json +import logging +import math +import os +import shutil +import tarfile +import tempfile +import sys +from io import open + +import torch +from torch import nn +from torch.nn import CrossEntropyLoss +import torch.nn.functional as F +from torch.nn.utils.weight_norm import weight_norm + +from .utils import cached_path +import pdb + +logger = logging.getLogger(__name__) + +PRETRAINED_MODEL_ARCHIVE_MAP = { + "bert-base-uncased": "https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased.tar.gz", + "bert-large-uncased": "https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-uncased.tar.gz", + "bert-base-cased": "https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-cased.tar.gz", + "bert-large-cased": "https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-cased.tar.gz", + "bert-base-multilingual-uncased": "https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-multilingual-uncased.tar.gz", + "bert-base-multilingual-cased": "https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-multilingual-cased.tar.gz", + "bert-base-chinese": "https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-chinese.tar.gz", +} + +def load_tf_weights_in_bert(model, tf_checkpoint_path): + """ Load tf checkpoints in a pytorch model + """ + try: + import re + import numpy as np + import tensorflow as tf + except ImportError: + print( + "Loading a TensorFlow models in PyTorch, requires TensorFlow to be installed. Please see " + "https://www.tensorflow.org/install/ for installation instructions." 
+ ) + raise + tf_path = os.path.abspath(tf_checkpoint_path) + print("Converting TensorFlow checkpoint from {}".format(tf_path)) + # Load weights from TF model + init_vars = tf.train.list_variables(tf_path) + names = [] + arrays = [] + for name, shape in init_vars: + print("Loading TF weight {} with shape {}".format(name, shape)) + array = tf.train.load_variable(tf_path, name) + names.append(name) + arrays.append(array) + + for name, array in zip(names, arrays): + name = name.split("/") + # adam_v and adam_m are variables used in AdamWeightDecayOptimizer to calculate m and v, + # which are not required when using a pretrained model + if any(n in ["adam_v", "adam_m"] for n in name): + print("Skipping {}".format("/".join(name))) + continue + pointer = model + for m_name in name: + if re.fullmatch(r"[A-Za-z]+_\d+", m_name): + l = re.split(r"_(\d+)", m_name) + else: + l = [m_name] + if l[0] == "kernel" or l[0] == "gamma": + pointer = getattr(pointer, "weight") + elif l[0] == "output_bias" or l[0] == "beta": + pointer = getattr(pointer, "bias") + elif l[0] == "output_weights": + pointer = getattr(pointer, "weight") + else: + pointer = getattr(pointer, l[0]) + if len(l) >= 2: + num = int(l[1]) + pointer = pointer[num] + if m_name[-11:] == "_embeddings": + pointer = getattr(pointer, "weight") + elif m_name == "kernel": + array = np.transpose(array) + try: + assert pointer.shape == array.shape + except AssertionError as e: + e.args += (pointer.shape, array.shape) + raise + print("Initialize PyTorch weight {}".format(name)) + pointer.data = torch.from_numpy(array) + return model + + + def gelu(x): + """Implementation of the gelu activation function. 
+ For information: OpenAI GPT's gelu is slightly different (and gives slightly different results): + 0.5 * x * (1 + torch.tanh(math.sqrt(2 / math.pi) * (x + 0.044715 * torch.pow(x, 3)))) + Also see https://arxiv.org/abs/1606.08415 + """ + return x * 0.5 * (1.0 + torch.erf(x / math.sqrt(2.0))) + + +def swish(x): + return x * torch.sigmoid(x) + + +ACT2FN = {"gelu": gelu, "relu": torch.nn.functional.relu, "swish": swish} + + +class BertConfig(object): + """Configuration class to store the configuration of a `BertModel`. + """ + + def __init__( + self, + vocab_size_or_config_json_file, + hidden_size=768, + num_hidden_layers=12, + num_attention_heads=12, + intermediate_size=3072, + hidden_act="gelu", + hidden_dropout_prob=0.1, + attention_probs_dropout_prob=0.1, + max_position_embeddings=512, + type_vocab_size=2, + initializer_range=0.02, + v_feature_size=2048, + v_target_size=1601, + v_hidden_size=768, + v_num_hidden_layers=3, + v_num_attention_heads=12, + v_intermediate_size=3072, + bi_hidden_size=1024, + bi_num_attention_heads=16, + v_attention_probs_dropout_prob=0.1, + v_hidden_act="gelu", + v_hidden_dropout_prob=0.1, + v_initializer_range=0.2, + v_biattention_id=[0, 1], + t_biattention_id=[10, 11], + predict_feature=False, + fast_mode=False, + fixed_v_layer=0, + fixed_t_layer=0, + in_batch_pairs=False, + fusion_method="mul", + intra_gate=False, + with_coattention=True + ): + + """Constructs BertConfig. + + Args: + vocab_size_or_config_json_file: Vocabulary size of `inputs_ids` in `BertModel`. + hidden_size: Size of the encoder layers and the pooler layer. + num_hidden_layers: Number of hidden layers in the Transformer encoder. + num_attention_heads: Number of attention heads for each attention layer in + the Transformer encoder. + intermediate_size: The size of the "intermediate" (i.e., feed-forward) + layer in the Transformer encoder. + hidden_act: The non-linear activation function (function or string) in the + encoder and pooler. 
If string, "gelu", "relu" and "swish" are supported. + hidden_dropout_prob: The dropout probabilitiy for all fully connected + layers in the embeddings, encoder, and pooler. + attention_probs_dropout_prob: The dropout ratio for the attention + probabilities. + max_position_embeddings: The maximum sequence length that this model might + ever be used with. Typically set this to something large just in case + (e.g., 512 or 1024 or 2048). + type_vocab_size: The vocabulary size of the `token_type_ids` passed into + `BertModel`. + initializer_range: The sttdev of the truncated_normal_initializer for + initializing all weight matrices. + """ + assert len(v_biattention_id) == len(t_biattention_id) + assert max(v_biattention_id) < v_num_hidden_layers + assert max(t_biattention_id) < num_hidden_layers + + if isinstance(vocab_size_or_config_json_file, str) or ( + sys.version_info[0] == 2 + and isinstance(vocab_size_or_config_json_file, unicode) + ): + with open(vocab_size_or_config_json_file, "r", encoding="utf-8") as reader: + json_config = json.loads(reader.read()) + for key, value in json_config.items(): + self.__dict__[key] = value + elif isinstance(vocab_size_or_config_json_file, int): + self.vocab_size = vocab_size_or_config_json_file + self.hidden_size = hidden_size + self.num_hidden_layers = num_hidden_layers + self.num_attention_heads = num_attention_heads + self.hidden_act = hidden_act + self.intermediate_size = intermediate_size + self.hidden_dropout_prob = hidden_dropout_prob + self.attention_probs_dropout_prob = attention_probs_dropout_prob + self.max_position_embeddings = max_position_embeddings + self.type_vocab_size = type_vocab_size + self.initializer_range = initializer_range + self.v_feature_size = v_feature_size + self.v_hidden_size = v_hidden_size + self.v_num_hidden_layers = v_num_hidden_layers + self.v_num_attention_heads = v_num_attention_heads + self.v_intermediate_size = v_intermediate_size + self.v_attention_probs_dropout_prob = 
v_attention_probs_dropout_prob + self.v_hidden_act = v_hidden_act + self.v_hidden_dropout_prob = v_hidden_dropout_prob + self.v_initializer_range = v_initializer_range + self.v_biattention_id = v_biattention_id + self.t_biattention_id = t_biattention_id + self.v_target_size = v_target_size + self.bi_hidden_size = bi_hidden_size + self.bi_num_attention_heads = bi_num_attention_heads + self.predict_feature = predict_feature + self.fast_mode = fast_mode + self.fixed_v_layer = fixed_v_layer + self.fixed_t_layer = fixed_t_layer + + self.in_batch_pairs = in_batch_pairs + self.fusion_method = fusion_method + self.intra_gate = intra_gate + self.with_coattention=with_coattention + else: + raise ValueError( + "First argument must be either a vocabulary size (int)" + "or the path to a pretrained model config file (str)" + ) + + @classmethod + def from_dict(cls, json_object): + """Constructs a `BertConfig` from a Python dictionary of parameters.""" + config = BertConfig(vocab_size_or_config_json_file=-1) + for key, value in json_object.items(): + config.__dict__[key] = value + return config + + @classmethod + def from_json_file(cls, json_file): + """Constructs a `BertConfig` from a json file of parameters.""" + with open(json_file, "r", encoding="utf-8") as reader: + text = reader.read() + return cls.from_dict(json.loads(text)) + + def __repr__(self): + return str(self.to_json_string()) + + def to_dict(self): + """Serializes this instance to a Python dictionary.""" + output = copy.deepcopy(self.__dict__) + return output + + def to_json_string(self): + """Serializes this instance to a JSON string.""" + return json.dumps(self.to_dict(), indent=2, sort_keys=True) + "\n" + +try: + from apex.normalization.fused_layer_norm import FusedLayerNorm as BertLayerNorm +except ImportError: + logger.info( + "Better speed can be achieved with apex installed from https://www.github.com/nvidia/apex ." 
+ ) + + class BertLayerNorm(nn.Module): + def __init__(self, hidden_size, eps=1e-12): + """Construct a layernorm module in the TF style (epsilon inside the square root). + """ + super(BertLayerNorm, self).__init__() + self.weight = nn.Parameter(torch.ones(hidden_size)) + self.bias = nn.Parameter(torch.zeros(hidden_size)) + self.variance_epsilon = eps + + def forward(self, x): + u = x.mean(-1, keepdim=True) + s = (x - u).pow(2).mean(-1, keepdim=True) + x = (x - u) / torch.sqrt(s + self.variance_epsilon) + return self.weight * x + self.bias + +class BertEmbeddings(nn.Module): + """Construct the embeddings from word, position and token_type embeddings. + """ + + def __init__(self, config): + super(BertEmbeddings, self).__init__() + self.word_embeddings = nn.Embedding( + config.vocab_size, config.hidden_size, padding_idx=0 + ) + self.position_embeddings = nn.Embedding( + config.max_position_embeddings, config.hidden_size + ) + self.token_type_embeddings = nn.Embedding( + config.type_vocab_size, config.hidden_size + ) + + # self.LayerNorm is not snake-cased to stick with TensorFlow model variable name and be able to load + # any TensorFlow checkpoint file + self.LayerNorm = BertLayerNorm(config.hidden_size, eps=1e-12) + self.dropout = nn.Dropout(config.hidden_dropout_prob) + + def forward(self, input_ids, token_type_ids=None): + seq_length = input_ids.size(1) + position_ids = torch.arange( + seq_length, dtype=torch.long, device=input_ids.device + ) + position_ids = position_ids.unsqueeze(0).expand_as(input_ids) + if token_type_ids is None: + token_type_ids = torch.zeros_like(input_ids) + + words_embeddings = self.word_embeddings(input_ids) + position_embeddings = self.position_embeddings(position_ids) + token_type_embeddings = self.token_type_embeddings(token_type_ids) + + embeddings = words_embeddings + position_embeddings + token_type_embeddings + embeddings = self.LayerNorm(embeddings) + embeddings = self.dropout(embeddings) + return embeddings + +class 
BertSelfAttention(nn.Module): + def __init__(self, config): + super(BertSelfAttention, self).__init__() + if config.hidden_size % config.num_attention_heads != 0: + raise ValueError( + "The hidden size (%d) is not a multiple of the number of attention " + "heads (%d)" % (config.hidden_size, config.num_attention_heads) + ) + self.num_attention_heads = config.num_attention_heads + self.attention_head_size = int(config.hidden_size / config.num_attention_heads) + self.all_head_size = self.num_attention_heads * self.attention_head_size + + self.query = nn.Linear(config.hidden_size, self.all_head_size) + self.key = nn.Linear(config.hidden_size, self.all_head_size) + self.value = nn.Linear(config.hidden_size, self.all_head_size) + + self.dropout = nn.Dropout(config.attention_probs_dropout_prob) + + def transpose_for_scores(self, x): + new_x_shape = x.size()[:-1] + ( + self.num_attention_heads, + self.attention_head_size, + ) + x = x.view(*new_x_shape) + return x.permute(0, 2, 1, 3) + + def forward(self, hidden_states, attention_mask): + mixed_query_layer = self.query(hidden_states) + mixed_key_layer = self.key(hidden_states) + mixed_value_layer = self.value(hidden_states) + + query_layer = self.transpose_for_scores(mixed_query_layer) + key_layer = self.transpose_for_scores(mixed_key_layer) + value_layer = self.transpose_for_scores(mixed_value_layer) + + # Take the dot product between "query" and "key" to get the raw attention scores. + attention_scores = torch.matmul(query_layer, key_layer.transpose(-1, -2)) + attention_scores = attention_scores / math.sqrt(self.attention_head_size) + # Apply the attention mask is (precomputed for all layers in BertModel forward() function) + attention_scores = attention_scores + attention_mask + + # Normalize the attention scores to probabilities. 
+ attention_probs = nn.Softmax(dim=-1)(attention_scores) + + # This is actually dropping out entire tokens to attend to, which might + # seem a bit unusual, but is taken from the original Transformer paper. + attention_probs = self.dropout(attention_probs) + + context_layer = torch.matmul(attention_probs, value_layer) + context_layer = context_layer.permute(0, 2, 1, 3).contiguous() + new_context_layer_shape = context_layer.size()[:-2] + (self.all_head_size,) + context_layer = context_layer.view(*new_context_layer_shape) + + return context_layer, attention_probs + + +class BertSelfOutput(nn.Module): + def __init__(self, config): + super(BertSelfOutput, self).__init__() + self.dense = nn.Linear(config.hidden_size, config.hidden_size) + self.LayerNorm = BertLayerNorm(config.hidden_size, eps=1e-12) + self.dropout = nn.Dropout(config.hidden_dropout_prob) + + def forward(self, hidden_states, input_tensor): + hidden_states = self.dense(hidden_states) + hidden_states = self.dropout(hidden_states) + hidden_states = self.LayerNorm(hidden_states + input_tensor) + return hidden_states + + +class BertAttention(nn.Module): + def __init__(self, config): + super(BertAttention, self).__init__() + self.self = BertSelfAttention(config) + self.output = BertSelfOutput(config) + + def forward(self, input_tensor, attention_mask): + self_output, attention_probs = self.self(input_tensor, attention_mask) + attention_output = self.output(self_output, input_tensor) + return attention_output, attention_probs + + +class BertIntermediate(nn.Module): + def __init__(self, config): + super(BertIntermediate, self).__init__() + self.dense = nn.Linear(config.hidden_size, config.intermediate_size) + if isinstance(config.hidden_act, str) or ( + sys.version_info[0] == 2 and isinstance(config.hidden_act, unicode) + ): + self.intermediate_act_fn = ACT2FN[config.hidden_act] + else: + self.intermediate_act_fn = config.hidden_act + + def forward(self, hidden_states): + hidden_states = 
self.dense(hidden_states) + hidden_states = self.intermediate_act_fn(hidden_states) + return hidden_states + + +class BertOutput(nn.Module): + def __init__(self, config): + super(BertOutput, self).__init__() + self.dense = nn.Linear(config.intermediate_size, config.hidden_size) + self.LayerNorm = BertLayerNorm(config.hidden_size, eps=1e-12) + self.dropout = nn.Dropout(config.hidden_dropout_prob) + + def forward(self, hidden_states, input_tensor): + hidden_states = self.dense(hidden_states) + hidden_states = self.dropout(hidden_states) + hidden_states = self.LayerNorm(hidden_states + input_tensor) + return hidden_states + + +class BertLayer(nn.Module): + def __init__(self, config): + super(BertLayer, self).__init__() + self.attention = BertAttention(config) + self.intermediate = BertIntermediate(config) + self.output = BertOutput(config) + + def forward(self, hidden_states, attention_mask): + attention_output, attention_probs = self.attention(hidden_states, attention_mask) + intermediate_output = self.intermediate(attention_output) + layer_output = self.output(intermediate_output, attention_output) + return layer_output, attention_probs + + +class BertImageSelfAttention(nn.Module): + def __init__(self, config): + super(BertImageSelfAttention, self).__init__() + if config.v_hidden_size % config.v_num_attention_heads != 0: + raise ValueError( + "The hidden size (%d) is not a multiple of the number of attention " + "heads (%d)" % (config.v_hidden_size, config.v_num_attention_heads) + ) + self.num_attention_heads = config.v_num_attention_heads + self.attention_head_size = int( + config.v_hidden_size / config.v_num_attention_heads + ) + self.all_head_size = self.num_attention_heads * self.attention_head_size + + self.query = nn.Linear(config.v_hidden_size, self.all_head_size) + self.key = nn.Linear(config.v_hidden_size, self.all_head_size) + self.value = nn.Linear(config.v_hidden_size, self.all_head_size) + + self.dropout = 
nn.Dropout(config.v_attention_probs_dropout_prob) + + def transpose_for_scores(self, x): + new_x_shape = x.size()[:-1] + ( + self.num_attention_heads, + self.attention_head_size, + ) + x = x.view(*new_x_shape) + return x.permute(0, 2, 1, 3) + + def forward(self, hidden_states, attention_mask): + mixed_query_layer = self.query(hidden_states) + mixed_key_layer = self.key(hidden_states) + mixed_value_layer = self.value(hidden_states) + + query_layer = self.transpose_for_scores(mixed_query_layer) + key_layer = self.transpose_for_scores(mixed_key_layer) + value_layer = self.transpose_for_scores(mixed_value_layer) + + # Take the dot product between "query" and "key" to get the raw attention scores. + attention_scores = torch.matmul(query_layer, key_layer.transpose(-1, -2)) + attention_scores = attention_scores / math.sqrt(self.attention_head_size) + # Apply the attention mask is (precomputed for all layers in BertModel forward() function) + attention_scores = attention_scores + attention_mask + + # Normalize the attention scores to probabilities. + attention_probs = nn.Softmax(dim=-1)(attention_scores) + + # This is actually dropping out entire tokens to attend to, which might + # seem a bit unusual, but is taken from the original Transformer paper. 
+ attention_probs = self.dropout(attention_probs) + + context_layer = torch.matmul(attention_probs, value_layer) + context_layer = context_layer.permute(0, 2, 1, 3).contiguous() + new_context_layer_shape = context_layer.size()[:-2] + (self.all_head_size,) + context_layer = context_layer.view(*new_context_layer_shape) + + return context_layer, attention_probs + +class BertImageSelfOutput(nn.Module): + def __init__(self, config): + super(BertImageSelfOutput, self).__init__() + self.dense = nn.Linear(config.v_hidden_size, config.v_hidden_size) + self.LayerNorm = BertLayerNorm(config.v_hidden_size, eps=1e-12) + self.dropout = nn.Dropout(config.v_hidden_dropout_prob) + + def forward(self, hidden_states, input_tensor): + hidden_states = self.dense(hidden_states) + hidden_states = self.dropout(hidden_states) + hidden_states = self.LayerNorm(hidden_states + input_tensor) + return hidden_states + +class BertImageAttention(nn.Module): + def __init__(self, config): + super(BertImageAttention, self).__init__() + self.self = BertImageSelfAttention(config) + self.output = BertImageSelfOutput(config) + + def forward(self, input_tensor, attention_mask): + self_output, attention_probs = self.self(input_tensor, attention_mask) + attention_output = self.output(self_output, input_tensor) + return attention_output, attention_probs + + +class BertImageIntermediate(nn.Module): + def __init__(self, config): + super(BertImageIntermediate, self).__init__() + self.dense = nn.Linear(config.v_hidden_size, config.v_intermediate_size) + if isinstance(config.v_hidden_act, str) or ( + sys.version_info[0] == 2 and isinstance(config.v_hidden_act, unicode) + ): + self.intermediate_act_fn = ACT2FN[config.v_hidden_act] + else: + self.intermediate_act_fn = config.v_hidden_act + + def forward(self, hidden_states): + hidden_states = self.dense(hidden_states) + hidden_states = self.intermediate_act_fn(hidden_states) + return hidden_states + + +class BertImageOutput(nn.Module): + def __init__(self, config): 
+ super(BertImageOutput, self).__init__() + self.dense = nn.Linear(config.v_intermediate_size, config.v_hidden_size) + self.LayerNorm = BertLayerNorm(config.v_hidden_size, eps=1e-12) + self.dropout = nn.Dropout(config.v_hidden_dropout_prob) + + def forward(self, hidden_states, input_tensor): + hidden_states = self.dense(hidden_states) + hidden_states = self.dropout(hidden_states) + hidden_states = self.LayerNorm(hidden_states + input_tensor) + return hidden_states + + +class BertImageLayer(nn.Module): + def __init__(self, config): + super(BertImageLayer, self).__init__() + self.attention = BertImageAttention(config) + self.intermediate = BertImageIntermediate(config) + self.output = BertImageOutput(config) + + def forward(self, hidden_states, attention_mask): + attention_output, attention_probs = self.attention(hidden_states, attention_mask) + intermediate_output = self.intermediate(attention_output) + layer_output = self.output(intermediate_output, attention_output) + return layer_output, attention_probs + + +class BertBiAttention(nn.Module): + def __init__(self, config): + super(BertBiAttention, self).__init__() + if config.bi_hidden_size % config.bi_num_attention_heads != 0: + raise ValueError( + "The hidden size (%d) is not a multiple of the number of attention " + "heads (%d)" % (config.bi_hidden_size, config.bi_num_attention_heads) + ) + + self.num_attention_heads = config.bi_num_attention_heads + self.attention_head_size = int( + config.bi_hidden_size / config.bi_num_attention_heads + ) + self.all_head_size = self.num_attention_heads * self.attention_head_size + + # self.scale = nn.Linear(1, self.num_attention_heads, bias=False) + # self.scale_act_fn = ACT2FN['relu'] + + self.query1 = nn.Linear(config.v_hidden_size, self.all_head_size) + self.key1 = nn.Linear(config.v_hidden_size, self.all_head_size) + self.value1 = nn.Linear(config.v_hidden_size, self.all_head_size) + # self.logit1 = nn.Linear(config.hidden_size, self.num_attention_heads) + + self.dropout1 
= nn.Dropout(config.v_attention_probs_dropout_prob) + + self.query2 = nn.Linear(config.hidden_size, self.all_head_size) + self.key2 = nn.Linear(config.hidden_size, self.all_head_size) + self.value2 = nn.Linear(config.hidden_size, self.all_head_size) + # self.logit2 = nn.Linear(config.hidden_size, self.num_attention_heads) + + self.dropout2 = nn.Dropout(config.attention_probs_dropout_prob) + + def transpose_for_scores(self, x): + new_x_shape = x.size()[:-1] + ( + self.num_attention_heads, + self.attention_head_size, + ) + x = x.view(*new_x_shape) + return x.permute(0, 2, 1, 3) + + def forward(self, input_tensor1, attention_mask1, input_tensor2, attention_mask2, co_attention_mask=None, use_co_attention_mask=False): + + # for vision input. + mixed_query_layer1 = self.query1(input_tensor1) + mixed_key_layer1 = self.key1(input_tensor1) + mixed_value_layer1 = self.value1(input_tensor1) + # mixed_logit_layer1 = self.logit1(input_tensor1) + + query_layer1 = self.transpose_for_scores(mixed_query_layer1) + key_layer1 = self.transpose_for_scores(mixed_key_layer1) + value_layer1 = self.transpose_for_scores(mixed_value_layer1) + # logit_layer1 = self.transpose_for_logits(mixed_logit_layer1) + + # for text input: + mixed_query_layer2 = self.query2(input_tensor2) + mixed_key_layer2 = self.key2(input_tensor2) + mixed_value_layer2 = self.value2(input_tensor2) + # mixed_logit_layer2 = self.logit2(input_tensor2) + + query_layer2 = self.transpose_for_scores(mixed_query_layer2) + key_layer2 = self.transpose_for_scores(mixed_key_layer2) + value_layer2 = self.transpose_for_scores(mixed_value_layer2) + # logit_layer2 = self.transpose_for_logits(mixed_logit_layer2) + + # Take the dot product between "query2" and "key1" to get the raw attention scores for value 1. 
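Aside: `transpose_for_scores` above splits the hidden axis into attention heads before the cross-stream dot products. A stdlib-only sketch of the resulting shape, using illustrative sizes (a 1024-d bi-attention hidden state with 8 heads is assumed here, not read from the repo's config):

```python
def transposed_shape(batch, seq_len, hidden, num_heads):
    """Shape produced by transpose_for_scores: the hidden axis is split into
    (num_heads, head_size) and moved ahead of the sequence axis, i.e.
    [batch, seq, hidden] -> [batch, heads, seq, head_size]."""
    assert hidden % num_heads == 0, "hidden size must divide evenly across heads"
    head_size = hidden // num_heads
    # view(batch, seq, heads, head_size) followed by permute(0, 2, 1, 3)
    return (batch, num_heads, seq_len, head_size)
```

For example, `transposed_shape(32, 36, 1024, 8)` gives `(32, 8, 36, 128)`, the layout both streams use before their matmuls.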
+        attention_scores1 = torch.matmul(query_layer2, key_layer1.transpose(-1, -2))
+        attention_scores1 = attention_scores1 / math.sqrt(self.attention_head_size)
+        attention_scores1 = attention_scores1 + attention_mask1
+
+        if use_co_attention_mask:
+            attention_scores1 = attention_scores1 + co_attention_mask.permute(0,1,3,2)
+
+        # Normalize the attention scores to probabilities.
+        attention_probs1 = nn.Softmax(dim=-1)(attention_scores1)
+
+        # This is actually dropping out entire tokens to attend to, which might
+        # seem a bit unusual, but is taken from the original Transformer paper.
+        attention_probs1 = self.dropout1(attention_probs1)
+
+        context_layer1 = torch.matmul(attention_probs1, value_layer1)
+        context_layer1 = context_layer1.permute(0, 2, 1, 3).contiguous()
+        new_context_layer_shape1 = context_layer1.size()[:-2] + (self.all_head_size,)
+        context_layer1 = context_layer1.view(*new_context_layer_shape1)
+
+        # Take the dot product between "query1" and "key2" to get the raw attention scores for value 2.
+        attention_scores2 = torch.matmul(query_layer1, key_layer2.transpose(-1, -2))
+        attention_scores2 = attention_scores2 / math.sqrt(self.attention_head_size)
+        # Apply the attention mask (precomputed for all layers in BertModel's forward() function).
+        # This addition could be skipped in a single-stream variant.
+        attention_scores2 = attention_scores2 + attention_mask2
+        if use_co_attention_mask:
+            attention_scores2 = attention_scores2 + co_attention_mask
+
+        # Normalize the attention scores to probabilities.
+        attention_probs2 = nn.Softmax(dim=-1)(attention_scores2)
+
+        # This is actually dropping out entire tokens to attend to, which might
+        # seem a bit unusual, but is taken from the original Transformer paper.
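Aside: the additive masks used above (0.0 for visible positions, -10000.0 for padding) drive masked positions to near-zero probability after the softmax. A stdlib-only sketch of that effect on a single row of scores (illustrative values, not the repo's API):

```python
import math

def masked_softmax(scores, additive_mask):
    # Mirror the model's trick: add the mask to the raw scores, then softmax.
    # Entries masked with -10000.0 end up with essentially zero probability.
    shifted = [s + m for s, m in zip(scores, additive_mask)]
    peak = max(shifted)  # subtract the max for numerical stability
    exps = [math.exp(s - peak) for s in shifted]
    total = sum(exps)
    return [e / total for e in exps]
```

With scores `[1.0, 1.0, 1.0]` and mask `[0.0, 0.0, -10000.0]`, the first two positions share the probability mass and the third vanishes.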
+ attention_probs2 = self.dropout2(attention_probs2) + + context_layer2 = torch.matmul(attention_probs2, value_layer2) + context_layer2 = context_layer2.permute(0, 2, 1, 3).contiguous() + new_context_layer_shape2 = context_layer2.size()[:-2] + (self.all_head_size,) + context_layer2 = context_layer2.view(*new_context_layer_shape2) + + return context_layer1, context_layer2, (attention_probs1, attention_probs2) + +class BertBiOutput(nn.Module): + def __init__(self, config): + super(BertBiOutput, self).__init__() + + self.dense1 = nn.Linear(config.bi_hidden_size, config.v_hidden_size) + self.LayerNorm1 = BertLayerNorm(config.v_hidden_size, eps=1e-12) + self.dropout1 = nn.Dropout(config.v_hidden_dropout_prob) + + self.q_dense1 = nn.Linear(config.bi_hidden_size, config.v_hidden_size) + self.q_dropout1 = nn.Dropout(config.v_hidden_dropout_prob) + + self.dense2 = nn.Linear(config.bi_hidden_size, config.hidden_size) + self.LayerNorm2 = BertLayerNorm(config.hidden_size, eps=1e-12) + self.dropout2 = nn.Dropout(config.hidden_dropout_prob) + + self.q_dense2 = nn.Linear(config.bi_hidden_size, config.hidden_size) + self.q_dropout2 = nn.Dropout(config.hidden_dropout_prob) + + def forward(self, hidden_states1, input_tensor1, hidden_states2, input_tensor2): + + + context_state1 = self.dense1(hidden_states1) + context_state1 = self.dropout1(context_state1) + + context_state2 = self.dense2(hidden_states2) + context_state2 = self.dropout2(context_state2) + + hidden_states1 = self.LayerNorm1(context_state1 + input_tensor1) + hidden_states2 = self.LayerNorm2(context_state2 + input_tensor2) + + return hidden_states1, hidden_states2 + +class BertConnectionLayer(nn.Module): + def __init__(self, config): + super(BertConnectionLayer, self).__init__() + self.biattention = BertBiAttention(config) + + self.biOutput = BertBiOutput(config) + + self.v_intermediate = BertImageIntermediate(config) + self.v_output = BertImageOutput(config) + + self.t_intermediate = BertIntermediate(config) + 
self.t_output = BertOutput(config)
+
+    def forward(self, input_tensor1, attention_mask1, input_tensor2, attention_mask2, co_attention_mask=None, use_co_attention_mask=False):
+
+        bi_output1, bi_output2, co_attention_probs = self.biattention(
+            input_tensor1, attention_mask1, input_tensor2, attention_mask2, co_attention_mask, use_co_attention_mask
+        )
+
+        attention_output1, attention_output2 = self.biOutput(bi_output2, input_tensor1, bi_output1, input_tensor2)
+
+        intermediate_output1 = self.v_intermediate(attention_output1)
+        layer_output1 = self.v_output(intermediate_output1, attention_output1)
+
+        intermediate_output2 = self.t_intermediate(attention_output2)
+        layer_output2 = self.t_output(intermediate_output2, attention_output2)
+
+        return layer_output1, layer_output2, co_attention_probs
+
+class BertEncoder(nn.Module):
+    def __init__(self, config):
+        super(BertEncoder, self).__init__()
+
+        # The encoder interleaves three kinds of blocks:
+        #   - text layers (BertLayer),
+        #   - vision layers (BertImageLayer),
+        #   - co-attention layers (BertConnectionLayer), which take the outputs
+        #     of the two streams and exchange information bidirectionally.
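Concretely, `t_biattention_id` / `v_biattention_id` determine how many single-stream layers run before each co-attention block. A sketch of the resulting schedule, using layer ids in the spirit of the released 6-connection base config (12 text layers, 6 image layers, co-attention paired with text layers 6-11 — assumed values for illustration):

```python
def coattention_schedule(t_ids, num_t_layers, v_ids, num_v_layers):
    """Return ((t_start, t_end), (v_start, v_end)) half-open layer ranges run
    before each co-attention block, plus the trailing ranges after the last one,
    mirroring the v_start/t_start bookkeeping in BertEncoder.forward (a sketch)."""
    steps, t_start, v_start = [], 0, 0
    for t_end, v_end in zip(t_ids, v_ids):
        steps.append(((t_start, t_end), (v_start, v_end)))
        t_start, v_start = t_end, v_end
    steps.append(((t_start, num_t_layers), (v_start, num_v_layers)))
    return steps
```

So the first co-attention step runs text layers 0-5 but no vision layers, and each later step advances both streams by one layer before exchanging information again.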
+ + self.FAST_MODE = config.fast_mode + self.with_coattention = config.with_coattention + self.v_biattention_id = config.v_biattention_id + self.t_biattention_id = config.t_biattention_id + self.in_batch_pairs = config.in_batch_pairs + self.fixed_t_layer = config.fixed_t_layer + self.fixed_v_layer = config.fixed_v_layer + layer = BertLayer(config) + v_layer = BertImageLayer(config) + connect_layer = BertConnectionLayer(config) + + self.layer = nn.ModuleList( + [copy.deepcopy(layer) for _ in range(config.num_hidden_layers)] + ) + self.v_layer = nn.ModuleList( + [copy.deepcopy(v_layer) for _ in range(config.v_num_hidden_layers)] + ) + self.c_layer = nn.ModuleList( + [copy.deepcopy(connect_layer) for _ in range(len(config.v_biattention_id))] + ) + + def forward( + self, + txt_embedding, + image_embedding, + txt_attention_mask, + image_attention_mask, + co_attention_mask=None, + output_all_encoded_layers=True, + output_all_attention_masks=False, + ): + + v_start = 0 + t_start = 0 + count = 0 + all_encoder_layers_t = [] + all_encoder_layers_v = [] + + all_attention_mask_t = [] + all_attnetion_mask_v = [] + all_attention_mask_c = [] + + batch_size, num_words, t_hidden_size = txt_embedding.size() + _, num_regions, v_hidden_size = image_embedding.size() + + use_co_attention_mask = False + for v_layer_id, t_layer_id in zip(self.v_biattention_id, self.t_biattention_id): + + v_end = v_layer_id + t_end = t_layer_id + + assert self.fixed_t_layer <= t_end + assert self.fixed_v_layer <= v_end + + for idx in range(v_start, self.fixed_v_layer): + with torch.no_grad(): + image_embedding, image_attention_probs = self.v_layer[idx](image_embedding, image_attention_mask) + v_start = self.fixed_v_layer + + if output_all_attention_masks: + all_attnetion_mask_v.append(image_attention_probs) + + for idx in range(v_start, v_end): + image_embedding, image_attention_probs = self.v_layer[idx](image_embedding, image_attention_mask) + + if output_all_attention_masks: + 
all_attnetion_mask_v.append(image_attention_probs) + + for idx in range(t_start, self.fixed_t_layer): + with torch.no_grad(): + txt_embedding, txt_attention_probs = self.layer[idx](txt_embedding, txt_attention_mask) + t_start = self.fixed_t_layer + if output_all_attention_masks: + all_attention_mask_t.append(txt_attention_probs) + + for idx in range(t_start, t_end): + txt_embedding, txt_attention_probs = self.layer[idx](txt_embedding, txt_attention_mask) + if output_all_attention_masks: + all_attention_mask_t.append(txt_attention_probs) + + if count == 0 and self.in_batch_pairs: + # new batch size is the batch_size ^2 + image_embedding = image_embedding.unsqueeze(0).expand(batch_size, batch_size, num_regions, v_hidden_size).contiguous().view(batch_size*batch_size, num_regions, v_hidden_size) + image_attention_mask = image_attention_mask.unsqueeze(0).expand(batch_size, batch_size, 1, 1, num_regions).contiguous().view(batch_size*batch_size, 1, 1, num_regions) + + txt_embedding = txt_embedding.unsqueeze(1).expand(batch_size, batch_size, num_words, t_hidden_size).contiguous().view(batch_size*batch_size, num_words, t_hidden_size) + txt_attention_mask = txt_attention_mask.unsqueeze(1).expand(batch_size, batch_size, 1, 1, num_words).contiguous().view(batch_size*batch_size, 1, 1, num_words) + co_attention_mask = co_attention_mask.unsqueeze(1).expand(batch_size, batch_size, 1, num_regions, num_words).contiguous().view(batch_size*batch_size, 1, num_regions, num_words) + + if count == 0 and self.FAST_MODE: + txt_embedding = txt_embedding.expand(image_embedding.size(0), txt_embedding.size(1), txt_embedding.size(2)) + txt_attention_mask = txt_attention_mask.expand(image_embedding.size(0), txt_attention_mask.size(1), txt_attention_mask.size(2), txt_attention_mask.size(3)) + + if self.with_coattention: + # do the bi attention. 
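Aside: the `in_batch_pairs` branch above crosses every caption with every image, so a batch of N yields N*N text-image pairs (the image index varies fastest, matching the `unsqueeze`/`expand`/`view` order). A stdlib sketch of that pairing:

```python
def pair_in_batch(texts, images):
    """Cross every text with every image, as the in_batch_pairs expansion does.
    Flattened index k maps to (text k // N, image k % N) for a batch of N."""
    n = len(images)
    return [(texts[k // n], images[k % n]) for k in range(len(texts) * n)]
```

This is what makes retrieval-style training possible: each caption is scored against every image in the batch, not just its own.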
+ image_embedding, txt_embedding, co_attention_probs = self.c_layer[count]( + image_embedding, image_attention_mask, txt_embedding, txt_attention_mask, co_attention_mask, use_co_attention_mask) + + # use_co_attention_mask = False + if output_all_attention_masks: + all_attention_mask_c.append(co_attention_probs) + + v_start = v_end + t_start = t_end + count += 1 + + if output_all_encoded_layers: + all_encoder_layers_t.append(txt_embedding) + all_encoder_layers_v.append(image_embedding) + + for idx in range(v_start, len(self.v_layer)): + image_embedding, image_attention_probs = self.v_layer[idx](image_embedding, image_attention_mask) + + if output_all_attention_masks: + all_attnetion_mask_v.append(image_attention_probs) + + for idx in range(t_start, len(self.layer)): + txt_embedding, txt_attention_probs = self.layer[idx](txt_embedding, txt_attention_mask) + + if output_all_attention_masks: + all_attention_mask_t.append(txt_attention_probs) + + # add the end part to finish. + if not output_all_encoded_layers: + all_encoder_layers_t.append(txt_embedding) + all_encoder_layers_v.append(image_embedding) + + return all_encoder_layers_t, all_encoder_layers_v, (all_attention_mask_t, all_attnetion_mask_v, all_attention_mask_c) + + +class BertTextPooler(nn.Module): + def __init__(self, config): + super(BertTextPooler, self).__init__() + self.dense = nn.Linear(config.hidden_size, config.bi_hidden_size) + self.activation = nn.ReLU() + + def forward(self, hidden_states): + # We "pool" the model by simply taking the hidden state corresponding + # to the first token. 
+        first_token_tensor = hidden_states[:, 0]
+        pooled_output = self.dense(first_token_tensor)
+        pooled_output = self.activation(pooled_output)
+        return pooled_output
+
+
+class BertImagePooler(nn.Module):
+    def __init__(self, config):
+        super(BertImagePooler, self).__init__()
+        self.dense = nn.Linear(config.v_hidden_size, config.bi_hidden_size)
+        self.activation = nn.ReLU()
+
+    def forward(self, hidden_states):
+        # We "pool" the model by simply taking the hidden state corresponding
+        # to the first token.
+        first_token_tensor = hidden_states[:, 0]
+        pooled_output = self.dense(first_token_tensor)
+        pooled_output = self.activation(pooled_output)
+        return pooled_output
+
+
+class BertPredictionHeadTransform(nn.Module):
+    def __init__(self, config):
+        super(BertPredictionHeadTransform, self).__init__()
+        self.dense = nn.Linear(config.hidden_size, config.hidden_size)
+        if isinstance(config.hidden_act, str) or (
+            sys.version_info[0] == 2 and isinstance(config.hidden_act, unicode)
+        ):
+            self.transform_act_fn = ACT2FN[config.hidden_act]
+        else:
+            self.transform_act_fn = config.hidden_act
+        self.LayerNorm = BertLayerNorm(config.hidden_size, eps=1e-12)
+
+    def forward(self, hidden_states):
+        hidden_states = self.dense(hidden_states)
+        hidden_states = self.transform_act_fn(hidden_states)
+        hidden_states = self.LayerNorm(hidden_states)
+        return hidden_states
+
+
+class BertImgPredictionHeadTransform(nn.Module):
+    def __init__(self, config):
+        super(BertImgPredictionHeadTransform, self).__init__()
+        self.dense = nn.Linear(config.v_hidden_size, config.v_hidden_size)
+        # Use the vision-stream activation consistently in both branches.
+        if isinstance(config.v_hidden_act, str) or (
+            sys.version_info[0] == 2 and isinstance(config.v_hidden_act, unicode)
+        ):
+            self.transform_act_fn = ACT2FN[config.v_hidden_act]
+        else:
+            self.transform_act_fn = config.v_hidden_act
+        self.LayerNorm = BertLayerNorm(config.v_hidden_size, eps=1e-12)
+
+    def forward(self, hidden_states):
+        hidden_states = self.dense(hidden_states)
+        hidden_states = 
self.transform_act_fn(hidden_states) + hidden_states = self.LayerNorm(hidden_states) + return hidden_states + + +class BertLMPredictionHead(nn.Module): + def __init__(self, config, bert_model_embedding_weights): + super(BertLMPredictionHead, self).__init__() + self.transform = BertPredictionHeadTransform(config) + + # The output weights are the same as the input embeddings, but there is + # an output-only bias for each token. + self.decoder = nn.Linear( + bert_model_embedding_weights.size(1), + bert_model_embedding_weights.size(0), + bias=False, + ) + self.decoder.weight = bert_model_embedding_weights + self.bias = nn.Parameter(torch.zeros(bert_model_embedding_weights.size(0))) + + def forward(self, hidden_states): + hidden_states = self.transform(hidden_states) + hidden_states = self.decoder(hidden_states) + self.bias + return hidden_states + + +class BertOnlyMLMHead(nn.Module): + def __init__(self, config, bert_model_embedding_weights): + super(BertOnlyMLMHead, self).__init__() + self.predictions = BertLMPredictionHead(config, bert_model_embedding_weights) + + def forward(self, sequence_output): + prediction_scores = self.predictions(sequence_output) + return prediction_scores + + +class BertOnlyNSPHead(nn.Module): + def __init__(self, config): + super(BertOnlyNSPHead, self).__init__() + self.seq_relationship = nn.Linear(config.hidden_size, 2) + + def forward(self, pooled_output): + seq_relationship_score = self.seq_relationship(pooled_output) + return seq_relationship_score + + +class BertPreTrainingHeads(nn.Module): + def __init__(self, config, bert_model_embedding_weights): + super(BertPreTrainingHeads, self).__init__() + self.predictions = BertLMPredictionHead(config, bert_model_embedding_weights) + self.bi_seq_relationship = nn.Linear(config.bi_hidden_size, 2) + self.imagePredictions = BertImagePredictionHead(config) + self.fusion_method = config.fusion_method + self.dropout = nn.Dropout(0.1) + + def forward( + self, sequence_output_t, sequence_output_v, 
pooled_output_t, pooled_output_v
+    ):
+
+        if self.fusion_method == 'sum':
+            pooled_output = self.dropout(pooled_output_t + pooled_output_v)
+        elif self.fusion_method == 'mul':
+            pooled_output = self.dropout(pooled_output_t * pooled_output_v)
+        else:
+            raise ValueError("Unsupported fusion method: %s" % self.fusion_method)
+
+        prediction_scores_t = self.predictions(sequence_output_t)
+        seq_relationship_score = self.bi_seq_relationship(pooled_output)
+        prediction_scores_v = self.imagePredictions(sequence_output_v)
+
+        return prediction_scores_t, prediction_scores_v, seq_relationship_score
+
+
+class BertImagePredictionHead(nn.Module):
+    def __init__(self, config):
+        super(BertImagePredictionHead, self).__init__()
+        self.transform = BertImgPredictionHeadTransform(config)
+
+        # Unlike the LM head, this decoder is not tied to an embedding matrix;
+        # it predicts a distribution over v_target_size region classes.
+        self.decoder = nn.Linear(config.v_hidden_size, config.v_target_size)
+
+    def forward(self, hidden_states):
+        hidden_states = self.transform(hidden_states)
+        hidden_states = self.decoder(hidden_states)
+        return hidden_states
+
+
+class BertPreTrainedModel(nn.Module):
+    """ An abstract class to handle weights initialization and
+        a simple interface for downloading and loading pretrained models.
+    """
+
+    def __init__(self, config, default_gpu=True, *inputs, **kwargs):
+        super(BertPreTrainedModel, self).__init__()
+
+        if not isinstance(config, BertConfig):
+            raise ValueError(
+                "Parameter config in `{}(config)` should be an instance of class `BertConfig`. "
+                "To create a model from a Google pretrained model use "
+                "`model = {}.from_pretrained(PRETRAINED_MODEL_NAME)`".format(
+                    self.__class__.__name__, self.__class__.__name__
+                )
+            )
+
+        self.config = config
+
+    def init_bert_weights(self, module):
+        """ Initialize the weights.
+ """ + if isinstance(module, (nn.Linear, nn.Embedding)): + # Slightly different from the TF version which uses truncated_normal for initialization + # cf https://github.com/pytorch/pytorch/pull/5617 + module.weight.data.normal_(mean=0.0, std=self.config.initializer_range) + elif isinstance(module, BertLayerNorm): + module.bias.data.zero_() + module.weight.data.fill_(1.0) + if isinstance(module, nn.Linear) and module.bias is not None: + module.bias.data.zero_() + + @classmethod + def from_pretrained( + cls, + pretrained_model_name_or_path, + config, + default_gpu=True, + state_dict=None, + cache_dir=None, + from_tf=False, + *inputs, + **kwargs + ): + """ + Instantiate a BertPreTrainedModel from a pre-trained model file or a pytorch state dict. + Download and cache the pre-trained model file if needed. + + Params: + pretrained_model_name_or_path: either: + - a str with the name of a pre-trained model to load selected in the list of: + . `bert-base-uncased` + . `bert-large-uncased` + . `bert-base-cased` + . `bert-large-cased` + . `bert-base-multilingual-uncased` + . `bert-base-multilingual-cased` + . `bert-base-chinese` + - a path or url to a pretrained model archive containing: + . `bert_config.json` a configuration file for the model + . `pytorch_model.bin` a PyTorch dump of a BertForPreTraining instance + - a path or url to a pretrained model archive containing: + . `bert_config.json` a configuration file for the model + . `model.chkpt` a TensorFlow checkpoint + from_tf: should we load the weights from a locally saved TensorFlow checkpoint + cache_dir: an optional path to a folder in which the pre-trained models will be cached. 
+            state_dict: an optional state dictionary (collections.OrderedDict object) to use instead of Google pre-trained models
+            *inputs, **kwargs: additional input for the specific Bert class
+                (ex: num_labels for BertForSequenceClassification)
+        """
+        CONFIG_NAME = "bert_config.json"
+        WEIGHTS_NAME = "pytorch_model.bin"
+        TF_WEIGHTS_NAME = "model.ckpt"
+
+        if pretrained_model_name_or_path in PRETRAINED_MODEL_ARCHIVE_MAP:
+            archive_file = PRETRAINED_MODEL_ARCHIVE_MAP[pretrained_model_name_or_path]
+        else:
+            archive_file = pretrained_model_name_or_path
+        # redirect to the cache, if necessary
+        try:
+            resolved_archive_file = cached_path(archive_file, cache_dir=cache_dir)
+        except EnvironmentError:
+            logger.error(
+                "Model name '{}' was not found in model name list ({}). "
+                "We assumed '{}' was a path or url but couldn't find any file "
+                "associated to this path or url.".format(
+                    pretrained_model_name_or_path,
+                    ", ".join(PRETRAINED_MODEL_ARCHIVE_MAP.keys()),
+                    archive_file,
+                )
+            )
+            return None
+
+        if default_gpu:
+            if resolved_archive_file == archive_file:
+                logger.info("loading archive file {}".format(archive_file))
+            else:
+                logger.info(
+                    "loading archive file {} from cache at {}".format(
+                        archive_file, resolved_archive_file
+                    )
+                )
+        tempdir = None
+        if os.path.isdir(resolved_archive_file) or from_tf:
+            serialization_dir = resolved_archive_file
+        elif resolved_archive_file[-3:] == 'bin':
+            serialization_dir = '/'.join(resolved_archive_file.split('/')[:-1])
+            WEIGHTS_NAME = resolved_archive_file.split('/')[-1]
+        else:
+            # Extract archive to temp dir
+            tempdir = tempfile.mkdtemp()
+            logger.info(
+                "extracting archive file {} to temp dir {}".format(
+                    resolved_archive_file, tempdir
+                )
+            )
+            with tarfile.open(resolved_archive_file, "r:gz") as archive:
+                archive.extractall(tempdir)
+            serialization_dir = tempdir
+        # Load config
+        # config_file = os.path.join(serialization_dir, CONFIG_NAME)
+        # config = BertConfig.from_json_file(config_file)
+        if default_gpu:
+            
logger.info("Model config {}".format(config)) + # Instantiate model. + model = cls(config, *inputs, **kwargs) + if state_dict is None and not from_tf: + weights_path = os.path.join(serialization_dir, WEIGHTS_NAME) + state_dict = torch.load( + weights_path, + map_location="cpu", + ) + if 'state_dict' in dir(state_dict): + state_dict = state_dict.state_dict() + + if tempdir: + # Clean up temp dir + shutil.rmtree(tempdir) + if from_tf: + # Directly load from a TensorFlow checkpoint + weights_path = os.path.join(serialization_dir, TF_WEIGHTS_NAME) + return load_tf_weights_in_bert(model, weights_path) + # Load from a PyTorch state_dict + old_keys = [] + new_keys = [] + for key in state_dict.keys(): + new_key = None + if "gamma" in key: + new_key = key.replace("gamma", "weight") + if "beta" in key: + new_key = key.replace("beta", "bias") + if new_key: + old_keys.append(key) + new_keys.append(new_key) + for old_key, new_key in zip(old_keys, new_keys): + state_dict[new_key] = state_dict.pop(old_key) + + missing_keys = [] + unexpected_keys = [] + error_msgs = [] + # copy state_dict so _load_from_state_dict can modify it + metadata = getattr(state_dict, "_metadata", None) + state_dict = state_dict.copy() + if metadata is not None: + state_dict._metadata = metadata + + def load(module, prefix=""): + local_metadata = {} if metadata is None else metadata.get(prefix[:-1], {}) + module._load_from_state_dict( + state_dict, + prefix, + local_metadata, + True, + missing_keys, + unexpected_keys, + error_msgs, + ) + for name, child in module._modules.items(): + if child is not None: + load(child, prefix + name + ".") + + start_prefix = "" + if not hasattr(model, "bert") and any( + s.startswith("bert.") for s in state_dict.keys() + ): + start_prefix = "bert." 
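Aside: the gamma/beta remapping above exists because older, TF-derived checkpoints name LayerNorm parameters `gamma`/`beta`, while the PyTorch modules expect `weight`/`bias`. A sketch of the same rename pass applied to a list of state-dict keys:

```python
def remap_layernorm_keys(state_dict_keys):
    """Apply the gamma->weight / beta->bias substitution that from_pretrained
    performs before loading an old checkpoint (a sketch over key names only)."""
    remapped = []
    for key in state_dict_keys:
        if "gamma" in key:
            key = key.replace("gamma", "weight")
        if "beta" in key:
            key = key.replace("beta", "bias")
        remapped.append(key)
    return remapped
```

Keys that contain neither substring pass through unchanged, so the remap is safe to run on a whole state dict.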
+        load(model, prefix=start_prefix)
+        if len(missing_keys) > 0 and default_gpu:
+            logger.info(
+                "Weights of {} not initialized from pretrained model: {}".format(
+                    model.__class__.__name__, missing_keys
+                )
+            )
+        if len(unexpected_keys) > 0 and default_gpu:
+            logger.info(
+                "Weights from pretrained model not used in {}: {}".format(
+                    model.__class__.__name__, unexpected_keys
+                )
+            )
+        if len(error_msgs) > 0 and default_gpu:
+            raise RuntimeError(
+                "Error(s) in loading state_dict for {}:\n\t{}".format(
+                    model.__class__.__name__, "\n\t".join(error_msgs)
+                )
+            )
+        return model
+
+
+class BertModel(BertPreTrainedModel):
+    """BERT model ("Bidirectional Encoder Representations from Transformers").
+
+    Params:
+        config: a BertConfig class instance with the configuration to build a new model
+
+    Inputs:
+        `input_ids`: a torch.LongTensor of shape [batch_size, sequence_length]
+            with the word token indices in the vocabulary (see the tokens preprocessing logic in the scripts
+            `extract_features.py`, `run_classifier.py` and `run_squad.py`)
+        `token_type_ids`: an optional torch.LongTensor of shape [batch_size, sequence_length] with the token
+            types indices selected in [0, 1]. Type 0 corresponds to a `sentence A` and type 1 corresponds to
+            a `sentence B` token (see BERT paper for more details).
+        `attention_mask`: an optional torch.LongTensor of shape [batch_size, sequence_length] with indices
+            selected in [0, 1]. It's a mask to be used if the input sequence length is smaller than the max
+            input sequence length in the current batch. It's the mask that we typically use for attention when
+            a batch has varying length sentences.
+        `output_all_encoded_layers`: boolean which controls the content of the `encoded_layers` output as described below. Default: `True`.
+
+    Outputs: Tuple of (encoded_layers, pooled_output)
+        `encoded_layers`: controlled by the `output_all_encoded_layers` argument:
+            - `output_all_encoded_layers=True`: outputs a list of the full sequences of encoded-hidden-states at the end
+                of each attention block (i.e. 12 full sequences for BERT-base, 24 for BERT-large), each
+                encoded-hidden-state is a torch.FloatTensor of size [batch_size, sequence_length, hidden_size],
+            - `output_all_encoded_layers=False`: outputs only the full sequence of hidden-states corresponding
+                to the last attention block of shape [batch_size, sequence_length, hidden_size],
+        `pooled_output`: a torch.FloatTensor of size [batch_size, hidden_size] which is the output of a
+            classifier pretrained on top of the hidden state associated to the first token of the
+            input (`[CLS]`) to train on the Next-Sentence task (see BERT's paper).
+
+    Example usage:
+    ```python
+    # Already been converted into WordPiece token ids
+    input_ids = torch.LongTensor([[31, 51, 99], [15, 5, 0]])
+    input_mask = torch.LongTensor([[1, 1, 1], [1, 1, 0]])
+    token_type_ids = torch.LongTensor([[0, 0, 1], [0, 1, 0]])
+
+    config = modeling.BertConfig(vocab_size_or_config_json_file=32000, hidden_size=768,
+        num_hidden_layers=12, num_attention_heads=12, intermediate_size=3072)
+
+    model = modeling.BertModel(config=config)
+    all_encoder_layers, pooled_output = model(input_ids, token_type_ids, input_mask)
+    ```
+    """
+
+    def __init__(self, config):
+        super(BertModel, self).__init__(config)
+
+        # initialize the word embeddings
+        self.embeddings = BertEmbeddings(config)
+
+        # initialize the vision embeddings
+        self.v_embeddings = BertImageEmbeddings(config)
+
+        self.encoder = BertEncoder(config)
+        self.t_pooler = BertTextPooler(config)
+        self.v_pooler = BertImagePooler(config)
+
+        self.apply(self.init_bert_weights)
+
+    def forward(
+        self,
+        input_txt,
+        input_imgs,
+        image_loc,
+        token_type_ids=None,
+        attention_mask=None,
+        image_attention_mask=None,
+        co_attention_mask=None,
+        
output_all_encoded_layers=False, + output_all_attention_masks=False, + ): + if attention_mask is None: + attention_mask = torch.ones_like(input_txt) + if token_type_ids is None: + token_type_ids = torch.zeros_like(input_txt) + if image_attention_mask is None: + image_attention_mask = torch.ones( + input_imgs.size(0), input_imgs.size(1) + ).type_as(input_txt) + + # We create a 3D attention mask from a 2D tensor mask. + # Sizes are [batch_size, 1, 1, to_seq_length] + # So we can broadcast to [batch_size, num_heads, from_seq_length, to_seq_length] + # this attention mask is more simple than the triangular masking of causal attention + # used in OpenAI GPT, we just need to prepare the broadcast dimension here. + extended_attention_mask = attention_mask.unsqueeze(1).unsqueeze(2) + extended_image_attention_mask = image_attention_mask.unsqueeze(1).unsqueeze(2) + + # Since attention_mask is 1.0 for positions we want to attend and 0.0 for + # masked positions, this operation will create a tensor which is 0.0 for + # positions we want to attend and -10000.0 for masked positions. + # Since we are adding it to the raw scores before the softmax, this is + # effectively the same as removing these entirely. 
+        extended_attention_mask = extended_attention_mask.to(
+            dtype=next(self.parameters()).dtype
+        ) # fp16 compatibility
+        extended_attention_mask = (1.0 - extended_attention_mask) * -10000.0
+
+        extended_image_attention_mask = extended_image_attention_mask.to(
+            dtype=next(self.parameters()).dtype
+        ) # fp16 compatibility
+        extended_image_attention_mask = (1.0 - extended_image_attention_mask) * -10000.0
+
+        if co_attention_mask is None:
+            co_attention_mask = torch.zeros(input_txt.size(0), input_imgs.size(1), input_txt.size(1)).type_as(extended_image_attention_mask)
+
+        extended_co_attention_mask = co_attention_mask.unsqueeze(1)
+
+        # extended_co_attention_mask = co_attention_mask.unsqueeze(-1)
+        extended_co_attention_mask = extended_co_attention_mask * 5.0
+        extended_co_attention_mask = extended_co_attention_mask.to(
+            dtype=next(self.parameters()).dtype
+        ) # fp16 compatibility
+
+        embedding_output = self.embeddings(input_txt, token_type_ids)
+        v_embedding_output = self.v_embeddings(input_imgs, image_loc)
+
+        encoded_layers_t, encoded_layers_v, all_attention_mask = self.encoder(
+            embedding_output,
+            v_embedding_output,
+            extended_attention_mask,
+            extended_image_attention_mask,
+            extended_co_attention_mask,
+            output_all_encoded_layers=output_all_encoded_layers,
+            output_all_attention_masks=output_all_attention_masks,
+        )
+
+        sequence_output_t = encoded_layers_t[-1]
+        sequence_output_v = encoded_layers_v[-1]
+
+        pooled_output_t = self.t_pooler(sequence_output_t)
+        pooled_output_v = self.v_pooler(sequence_output_v)
+
+        if not output_all_encoded_layers:
+            encoded_layers_t = encoded_layers_t[-1]
+            encoded_layers_v = encoded_layers_v[-1]
+
+        return encoded_layers_t, encoded_layers_v, pooled_output_t, pooled_output_v, all_attention_mask
+
+
+class BertImageEmbeddings(nn.Module):
+    """Construct the embeddings from the image region features and their spatial locations.
+ """ + def __init__(self, config): + super(BertImageEmbeddings, self).__init__() + + self.image_embeddings = nn.Linear(config.v_feature_size, config.v_hidden_size) + self.image_location_embeddings = nn.Linear(5, config.v_hidden_size) + self.LayerNorm = BertLayerNorm(config.v_hidden_size, eps=1e-12) + self.dropout = nn.Dropout(config.hidden_dropout_prob) + + def forward(self, input_ids, input_loc): + + img_embeddings = self.image_embeddings(input_ids) + loc_embeddings = self.image_location_embeddings(input_loc) + embeddings = self.LayerNorm(img_embeddings+loc_embeddings) + embeddings = self.dropout(embeddings) + + return embeddings + + +class BertForMultiModalPreTraining(BertPreTrainedModel): + """BERT model with multi modal pre-training heads. + """ + + def __init__(self, config): + super(BertForMultiModalPreTraining, self).__init__(config) + + self.bert = BertModel(config) + self.cls = BertPreTrainingHeads( + config, self.bert.embeddings.word_embeddings.weight + ) + + self.apply(self.init_bert_weights) + self.predict_feature = config.predict_feature + self.loss_fct = CrossEntropyLoss(ignore_index=-1) + + print("model's option for predict_feature is ", config.predict_feature) + + if self.predict_feature: + self.vis_criterion = nn.MSELoss(reduction="none") + else: + self.vis_criterion = nn.KLDivLoss(reduction="none") + + def forward( + self, + input_ids, + image_feat, + image_loc, + token_type_ids=None, + attention_mask=None, + image_attention_mask=None, + masked_lm_labels=None, + image_label=None, + image_target = None, + next_sentence_label=None, + output_all_attention_masks=False + ): + + # in this model, we first embed the images. 
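Aside: `BertImageEmbeddings` expects a 5-d location vector per region (`nn.Linear(5, v_hidden_size)`). The exact encoding is produced by the data loaders rather than this file; a common convention, assumed here purely for illustration, is normalized box corners plus fractional area:

```python
def region_location(x1, y1, x2, y2, img_w, img_h):
    """Hypothetical helper: normalized corners plus the fraction of the image
    the region covers, giving the 5 inputs that image_location_embeddings
    projects to v_hidden_size (the real encoding lives in the dataset code)."""
    area = (x2 - x1) * (y2 - y1) / (img_w * img_h)
    return (x1 / img_w, y1 / img_h, x2 / img_w, y2 / img_h, area)
```

A 50x50 box in the top-left corner of a 100x100 image, for instance, encodes as corners (0, 0, 0.5, 0.5) with 0.25 of the image covered.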
+        sequence_output_t, sequence_output_v, pooled_output_t, pooled_output_v, all_attention_mask = self.bert(
+            input_ids,
+            image_feat,
+            image_loc,
+            token_type_ids,
+            attention_mask,
+            image_attention_mask,
+            output_all_encoded_layers=False,
+            output_all_attention_masks=output_all_attention_masks
+        )
+
+        prediction_scores_t, prediction_scores_v, seq_relationship_score = self.cls(
+            sequence_output_t, sequence_output_v, pooled_output_t, pooled_output_v
+        )
+
+        if masked_lm_labels is not None and next_sentence_label is not None and image_target is not None:
+
+            prediction_scores_v = prediction_scores_v[:, 1:]
+            if self.predict_feature:
+                img_loss = self.vis_criterion(prediction_scores_v, image_target)
+                masked_img_loss = torch.sum(
+                    img_loss * (image_label == 1).unsqueeze(2).float()
+                ) / max(torch.sum((image_label == 1).unsqueeze(2).expand_as(img_loss)),1)
+
+            else:
+                img_loss = self.vis_criterion(
+                    F.log_softmax(prediction_scores_v, dim=2), image_target
+                )
+                # Guard against batches with no masked regions (avoid dividing by zero).
+                masked_img_loss = torch.sum(
+                    img_loss * (image_label == 1).unsqueeze(2).float()
+                ) / max(torch.sum((image_label == 1)), 1)
+
+            # masked_img_loss = torch.sum(img_loss) / (img_loss.shape[0] * img_loss.shape[1])
+            masked_lm_loss = self.loss_fct(
+                prediction_scores_t.view(-1, self.config.vocab_size),
+                masked_lm_labels.view(-1),
+            )
+            next_sentence_loss = self.loss_fct(
+                seq_relationship_score.view(-1, 2), next_sentence_label.view(-1)
+            )
+            # total_loss = masked_lm_loss + next_sentence_loss + masked_img_loss
+            return masked_lm_loss.unsqueeze(0), masked_img_loss.unsqueeze(0), next_sentence_loss.unsqueeze(0)
+        else:
+            return prediction_scores_t, prediction_scores_v, seq_relationship_score, all_attention_mask
+
+class VILBertForVLTasks(BertPreTrainedModel):
+    def __init__(self, config, num_labels, dropout_prob=0.1, default_gpu=True):
+        super(VILBertForVLTasks, self).__init__(config)
+        self.num_labels = num_labels
+        self.bert = BertModel(config)
+        self.dropout = nn.Dropout(dropout_prob)
+        self.cls = 
BertPreTrainingHeads( + config, self.bert.embeddings.word_embeddings.weight + ) + self.vil_prediction = SimpleClassifier(config.bi_hidden_size, config.bi_hidden_size*2, num_labels, 0.5) + # self.vil_prediction = nn.Linear(config.bi_hidden_size, num_labels) + self.vil_logit = nn.Linear(config.bi_hidden_size, 1) + self.vision_logit = nn.Linear(config.v_hidden_size, 1) + self.linguisic_logit = nn.Linear(config.hidden_size, 1) + self.fusion_method = config.fusion_method + self.apply(self.init_bert_weights) + + def forward( + self, + input_txt, + input_imgs, + image_loc, + token_type_ids=None, + attention_mask=None, + image_attention_mask=None, + co_attention_mask=None, + output_all_encoded_layers=False, + ): + sequence_output_t, sequence_output_v, pooled_output_t, pooled_output_v, _ = self.bert( + input_txt, + input_imgs, + image_loc, + token_type_ids, + attention_mask, + image_attention_mask, + co_attention_mask, + output_all_encoded_layers=False, + ) + + vil_prediction = 0 + vil_logit = 0 + vil_binary_prediction = 0 + vision_prediction = 0 + vision_logit = 0 + linguisic_prediction = 0 + linguisic_logit = 0 + + linguisic_prediction, vision_prediction, vil_binary_prediction = self.cls( + sequence_output_t, sequence_output_v, pooled_output_t, pooled_output_v + ) + + if self.fusion_method == 'sum': + pooled_output = self.dropout(pooled_output_t + pooled_output_v) + elif self.fusion_method == 'mul': + pooled_output = self.dropout(pooled_output_t * pooled_output_v) + else: + assert False + + vil_prediction = self.vil_prediction(pooled_output) + vil_logit = self.vil_logit(pooled_output) + vision_logit = self.vision_logit(self.dropout(sequence_output_v)) + ((1.0 - image_attention_mask)* -10000.0).unsqueeze(2).to(dtype=next(self.parameters()).dtype) + linguisic_logit = self.linguisic_logit(self.dropout(sequence_output_t)) + + return vil_prediction, vil_logit, vil_binary_prediction, vision_prediction, vision_logit, linguisic_prediction, linguisic_logit + +class 
SimpleClassifier(nn.Module): + def __init__(self, in_dim, hid_dim, out_dim, dropout): + super(SimpleClassifier, self).__init__() + layers = [ + weight_norm(nn.Linear(in_dim, hid_dim), dim=None), + nn.ReLU(), + nn.Dropout(dropout, inplace=True), + weight_norm(nn.Linear(hid_dim, out_dim), dim=None) + ] + self.main = nn.Sequential(*layers) + + def forward(self, x): + logits = self.main(x) + return logits \ No newline at end of file diff --git a/vlbert_tasks.yml b/vlbert_tasks.yml new file mode 100644 index 0000000..ec82410 --- /dev/null +++ b/vlbert_tasks.yml @@ -0,0 +1,228 @@ +TASK0: + name: ConceptualCaption + type: L-classifier + task_id: 0 + features_h5path: + train_file: + max_seq_length: 30 + batch_size: 128 + train_split: train + eval_split: val +TASK1: + name: VQA + type: VL-classifier + loss: BCEWithLogitLoss + task_id: 1 + dataroot: data/VQA + features_h5path1: data/coco/coco_trainval_resnet101_faster_rcnn_genome.lmdb + features_h5path2: '' + train_annotations_jsonpath: '' + val_annotations_jsonpath: '' + max_seq_length: 16 + max_region_num: 100 + batch_size: 256 + eval_batch_size: 1024 + train_split: trainval + val_split: minval + lr: 0.00004 + num_epoch: 20 +TASK2: + name: VQA-MC + type: VL-logit + loss: CrossEntropyLoss + task_id: 2 + dataroot: data/VQA + features_h5path1: data/coco/coco_trainval_resnet101_faster_rcnn_genome.lmdb + features_h5path2: '' + train_annotations_jsonpath: '' + val_annotations_jsonpath: '' + max_seq_length: 20 + max_region_num: 100 + batch_size: 128 + train_split: trainval + val_split: minval +TASK3: + name: GenomeQA + type: VL-classifier + task_id: 3 + features_h5path: data/coco/coco_trainval.h5 + train_file: data/VQA/ + split: train +TASK4: + name: VisualDialog + type: VL-logit + loss: CrossEntropyLoss + task_id: 4 + dataroot: data/VisualDialog + features_h5path1: data/coco/coco_trainval.h5 + features_h5path2: '' + train_annotations_jsonpath: 'data/VisualDialog/visdial_1.0_train.json' + val_annotations_jsonpath: 
'data/VisualDialog/visdial_1.0_val.json' + max_seq_length: 60 + max_region_num: 100 + batch_size: 64 + train_split: train + val_split: val + lr: 0.00001 + num_epoch: 20 +TASK5: + name: Vissual7w + type: VL-logit + task_id: 5 + split: train +TASK6: + name: VCR_Q-A + type: VL-logit + loss: CrossEntropyLoss + task_id: 6 + dataroot: data/VCR + features_h5path1: /srv/datasets/vilbert/VCR/VCR_resnet101_faster_rcnn_genome.lmdb + features_h5path2: /srv/datasets/vilbert/VCR/VCR_gt_resnet101_faster_rcnn_genome.lmdb + train_annotations_jsonpath: data/VCR/train.jsonl + val_annotations_jsonpath: data/VCR/val.jsonl + max_seq_length: 60 + max_region_num: 100 + batch_size: 64 + train_split: train + val_split: val + lr: 0.00002 + num_epoch: 20 +TASK7: + name: VCR_QA-R + type: VL-logit + loss: CrossEntropyLoss + task_id: 7 + dataroot: data/VCR + features_h5path1: /srv/datasets/vilbert/VCR/VCR_resnet101_faster_rcnn_genome.lmdb + features_h5path2: /srv/datasets/vilbert/VCR/VCR_gt_resnet101_faster_rcnn_genome.lmdb + train_annotations_jsonpath: data/VCR/train.jsonl + val_annotations_jsonpath: data/VCR/val.jsonl + max_seq_length: 80 + max_region_num: 100 + batch_size: 64 + train_split: train + val_split: val + lr: 0.00002 + num_epoch: 20 +TASK8: + name: RetrievalCOCO + type: VL-logit + loss: CrossEntropyLoss + task_id: 8 + dataroot: data/cocoRetreival + features_h5path1: data/coco/coco_trainval_resnet101_faster_rcnn_genome.lmdb + features_h5path2: '' + train_annotations_jsonpath: data/cocoRetreival/all_data_final_train_2014.jsonline + val_annotations_jsonpath: data/cocoRetreival/all_data_final_val_set0_2014.jsonline + max_seq_length: 30 + max_region_num: 100 + batch_size: 64 + train_split: train + val_split: val + lr: 0.00002 + num_epoch: 20 +TASK9: + name: RetrievalFlickr30k + type: VL-logit + loss: CrossEntropyLoss + task_id: 9 + dataroot: data/flickr30k + features_h5path1: data/flickr30k/flickr30k_resnet101_faster_rcnn_genome.lmdb + features_h5path2: '' + train_annotations_jsonpath: 
data/flickr30k/all_data_final_train_2014.jsonline + val_annotations_jsonpath: data/flickr30k/all_data_final_val_set0_2014.jsonline + max_seq_length: 30 + max_region_num: 100 + batch_size: 64 + train_split: train + val_split: val + lr: 0.00002 + num_epoch: 20 +TASK10: + name: refcoco + type: V-logit + loss: BCEWithLogitLoss + task_id: 10 + dataroot: data/referExpression + features_h5path1: data/referExpression/refcoco.h5 + features_h5path2: data/referExpression/refcoco_gt.h5 + train_annotations_jsonpath: '' + val_annotations_jsonpath: '' + max_seq_length: 20 + max_region_num: 100 + batch_size: 256 + train_split: train + val_split: val + lr: 0.00004 + num_epoch: 20 +TASK11: + name: refcoco+ + type: V-logit + loss: BCEWithLogitLoss + task_id: 11 + dataroot: data/referExpression + features_h5path1: data/referExpression/refcoco+_resnet101_faster_rcnn_genome.lmdb + features_h5path2: data/referExpression/refcoco+_gt_resnet101_faster_rcnn_genome.lmdb + train_annotations_jsonpath: '' + val_annotations_jsonpath: '' + max_seq_length: 20 + max_region_num: 100 + batch_size: 256 + eval_batch_size: 1024 + train_split: train + val_split: val + lr: 0.00004 + num_epoch: 20 +TASK12: + name: refgoogle + type: V-logit + loss: BCEWithLogitLoss + task_id: 12 + dataroot: data/referExpression + features_h5path1: data/referExpression/refcoco+.h5 + features_h5path2: data/referExpression/refcoco+_gt.h5 + train_annotations_jsonpath: '' + val_annotations_jsonpath: '' + max_seq_length: 20 + max_region_num: 100 + batch_size: 256 + train_split: train + val_split: val + start_iteration: 0 + lr: 0.00004 + num_epoch: 20 +TASK13: + name: refDenseCaption + type: V-logit + loss: BCEWithLogitLoss + task_id: 13 + dataroot: data/visgenome + features_h5path1: data/referExpression/refcoco+.h5 + features_h5path2: '' + train_annotations_jsonpath: 'data/visgenome/region_descriptions.json' + val_annotations_jsonpath: 'data/visgenome/region_descriptions.json' + max_seq_length: 20 + max_region_num: 60 + 
batch_size: 256 + train_split: train + val_split: val +TASK14: + name: Flickr30kGround + type: L-logit + task_id: 14 + split: train +TASK15: + name: CocoCaption + type: L-classifier + task_id: 15 + split: train +TASK16: + name: Flickr30kCaption + type: L-classifier + task_id: 16 + split: train +TASK17: + name: NLVR2 + type: VL-logit + task_id: 17 + split: train \ No newline at end of file
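
For reviewers: the masked-region loss in `BertForMultiModalPreTraining` (the `KLDivLoss` branch above) can be exercised in isolation. The sketch below is a hypothetical standalone helper, not code from this patch; it assumes region scores of shape `(batch, regions, classes)`, soft targets of the same shape, and an `image_label` tensor where 1 marks a masked region, mirroring the semantics used in the diff.

```python
import torch
import torch.nn.functional as F

def masked_region_loss(prediction_scores_v, image_target, image_label):
    """KL divergence between predicted and target region distributions,
    averaged only over masked regions (image_label == 1)."""
    # Elementwise KL terms: target * (log target - log_softmax(scores))
    img_loss = F.kl_div(
        F.log_softmax(prediction_scores_v, dim=2), image_target, reduction="none"
    )
    mask = (image_label == 1).unsqueeze(2).float()
    # Sum over masked regions; guard the count with 1 so an all-unmasked
    # batch does not divide by zero.
    return torch.sum(img_loss * mask) / max(torch.sum(image_label == 1).item(), 1)

scores = torch.randn(2, 4, 10)
target = torch.softmax(torch.randn(2, 4, 10), dim=2)
label = torch.tensor([[1, 0, 0, 1], [0, 1, 0, 0]])
loss = masked_region_loss(scores, target, label)
```

Because the mask keeps entire class rows, the summed KL terms per masked region are non-negative, and a batch with no masked regions yields a loss of exactly zero.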