2nd Place Solution for SemEval-2018 Task 11: Machine Comprehension using Commonsense Knowledge

istvan-vincze/commonsense-rc

 
 


Model Overview

We use attention-based LSTM networks.

For more technical details, please refer to our paper: https://arxiv.org/abs/1803.00191

The official leaderboard is available at https://competitions.codalab.org/competitions/17184#results (Evaluation Phase).

The overall model architecture is shown below:

[Figure: Three-way Attentive Networks]
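As a rough illustration of the attention mechanism used throughout the model, the sketch below implements a bilinear sequence-attention layer in PyTorch. The class and variable names are ours for illustration, not the repository's actual code:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class WordAttention(nn.Module):
    """Bilinear word-level attention: each query position attends over a context sequence."""

    def __init__(self, dim):
        super().__init__()
        self.linear = nn.Linear(dim, dim)

    def forward(self, query, context):
        # query: (batch, len_q, dim), context: (batch, len_c, dim)
        scores = query.bmm(self.linear(context).transpose(1, 2))  # (batch, len_q, len_c)
        weights = F.softmax(scores, dim=-1)                       # normalize over context
        return weights.bmm(context)                               # (batch, len_q, dim)

# Toy usage: attend a question over a passage, both already encoded (e.g., by a BiLSTM)
torch.manual_seed(0)
passage = torch.randn(2, 30, 128)   # batch of 2, 30 passage tokens
question = torch.randn(2, 10, 128)  # batch of 2, 10 question tokens
attn = WordAttention(128)
attended = attn(question, passage)
print(attended.shape)  # torch.Size([2, 10, 128])
```

In the full model, layers like this relate passage, question, and answer representations to each other before the LSTM encoding and final scoring.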

How to run

Prerequisites

pytorch >= 0.2

spacy >= 2.0

A GPU machine is preferred; training on a CPU will be much slower.
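A quick way to verify that the environment meets these minimums (a generic sanity-check snippet, not part of the repository):

```python
# Check that the required packages are importable and print their versions.
import importlib

for pkg, minimum in [("torch", "0.2"), ("spacy", "2.0")]:
    try:
        mod = importlib.import_module(pkg)
        print(pkg, mod.__version__)
    except ImportError:
        print(pkg, "is not installed (need >=", minimum + ")")
```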

Step 1:

Download the preprocessed data from Google Drive or Baidu Cloud Disk, unzip it, and put it under the folder data/.

If you choose to preprocess the dataset yourself, preprocess the official dataset with python3 src/preprocess.py and download the GloVe embeddings. Also remember to download ConceptNet and preprocess it with python3 src/preprocess.py conceptnet
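If you take the do-it-yourself route, the GloVe step boils down to building an embedding matrix for your vocabulary from the pretrained vectors. A minimal sketch, assuming a standard GloVe text file; the function name and the zero-vector fallback for out-of-vocabulary words are our choices, not necessarily what src/preprocess.py does:

```python
import numpy as np

def load_glove(path, vocab, dim=300):
    """Build a (len(vocab), dim) embedding matrix from a GloVe text file.

    Each line of the file is: word v1 v2 ... v_dim.
    Words missing from GloVe keep a zero vector.
    """
    word2idx = {w: i for i, w in enumerate(vocab)}
    emb = np.zeros((len(vocab), dim), dtype=np.float32)
    with open(path, encoding="utf-8") as f:
        for line in f:
            parts = line.rstrip().split(" ")
            if parts[0] in word2idx:
                emb[word2idx[parts[0]]] = np.asarray(parts[1:], dtype=np.float32)
    return emb
```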

The official dataset can be downloaded from hidrive.

We transform the original XML-format data to JSON format with xml2json by running ./xml2json.py --pretty --strip_text -t xml2json -o test-data.json test-data.xml
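If you prefer not to install the xml2json tool, an equivalent conversion can be sketched with the Python standard library. The key conventions here ("@" for attributes, "#text" for text nodes) are our choice and will not match xml2json's output format exactly:

```python
import json
import xml.etree.ElementTree as ET

def element_to_dict(elem):
    """Recursively convert an XML element into a nested dict.

    Attributes become "@name" keys; child elements are grouped into lists
    by tag; leaf text is stored under "#text".
    """
    node = {"@" + k: v for k, v in elem.attrib.items()}
    children = list(elem)
    if children:
        for child in children:
            node.setdefault(child.tag, []).append(element_to_dict(child))
    elif elem.text and elem.text.strip():
        node["#text"] = elem.text.strip()
    return node

def xml_to_json(xml_path, json_path):
    root = ET.parse(xml_path).getroot()
    with open(json_path, "w", encoding="utf-8") as f:
        json.dump({root.tag: element_to_dict(root)}, f, indent=2)
```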

Step 2:

Train the model with python3 src/main.py --gpu 0; accuracy on the development set should reach approximately 83% after 50 epochs.

How to reproduce our competition results

Following the above instructions, you will get a model with ~81.5% accuracy on the test set. We used two additional techniques for our official submission (~83.95% accuracy):

  1. Pretrain our model on the RACE dataset for 10 epochs.

  2. Train 9 models with different random seeds and ensemble their outputs.
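The ensembling step can be sketched as averaging each model's per-answer probabilities and taking the argmax. This is a common ensembling scheme; the repository's exact combination rule may differ:

```python
import numpy as np

def ensemble_predict(prob_lists):
    """Average per-model answer probabilities and pick the best answer per question.

    prob_lists: list of n_models arrays shaped (n_questions, n_answers).
    Returns an array of predicted answer indices, one per question.
    """
    avg = np.mean(np.asarray(prob_lists), axis=0)  # (n_questions, n_answers)
    return avg.argmax(axis=1)

# Toy example: 3 models, 2 questions, 2 candidate answers each
model_probs = [
    [[0.9, 0.1], [0.4, 0.6]],
    [[0.6, 0.4], [0.3, 0.7]],
    [[0.7, 0.3], [0.55, 0.45]],
]
print(ensemble_predict(model_probs))  # [0 1]
```

Averaging probabilities from models trained with different random seeds reduces the variance of any single run, which is why the ensemble outscores the individual models.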
