
Coreference Model Experimentation (Tensorflow and Pytorch) : Mainly Using transfer learning and Transformer Model BERT


Bert-Coref-Resolution-Lee-

A reimplementation of the official repository for End-to-end Neural Coreference Resolution
(https://arxiv.org/pdf/1707.07045.pdf).
Use https://github.com/kentonl/e2e-coref for setting up the requirements and preparing the GloVe vectors and ELMo embeddings.

To set up the Prince cluster to run these scripts, follow https://github.com/ppriyank/Prince-Set-UP

bert_end_2_end.py & train-bert_end2end.py

Replaces the GloVe and ELMo vectors from the original paper with BERT embeddings generated at run time.

Since BERT operates on sequences while the original code processes text in sentence chunks, each sequence is split back into sentences at run time. For a detailed explanation, go to line #323.
For a walkthrough of the TensorFlow splitting code, see https://stackoverflow.com/questions/34970582/using-a-variable-for-num-splits-for-tf-split/56015552#56015552 (my own answer).
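The run-time split idea above can be sketched in plain Python (this is a minimal illustration, not the repo's actual code, which performs the split in TensorFlow via tf.split with a run-time sizes tensor; the function name and inputs here are hypothetical):

```python
# Hypothetical sketch: cut a flat token sequence back into its sentences
# using per-sentence lengths that are only known at run time, analogous
# to tf.split with a variable num_or_size_splits argument.

def split_into_sentences(tokens, sentence_lengths):
    """Split a flat token list into sentence chunks of the given lengths."""
    if sum(sentence_lengths) != len(tokens):
        raise ValueError("sentence lengths must sum to the sequence length")
    sentences, start = [], 0
    for length in sentence_lengths:
        sentences.append(tokens[start:start + length])
        start += length
    return sentences

# Example: a 7-token sequence split into sentences of lengths 3, 2, 2.
tokens = ["John", "saw", "Mary", "She", "waved", "He", "smiled"]
print(split_into_sentences(tokens, [3, 2, 2]))
# → [['John', 'saw', 'Mary'], ['She', 'waved'], ['He', 'smiled']]
```

In the TensorFlow version the split sizes live in a tensor rather than a Python list, which is why the linked Stack Overflow answer is needed: tf.split must be driven by a run-time value instead of a static count.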

Original Span Generation

Co-reference resolution

Multi-Tasking Approach
