
EATN

Dataset and source code for our paper: "EATN: An Efficient Adaptive Transfer Network for Aspect-level Sentiment Analysis".

Requirements

— Python 3.6

— NumPy 1.13.3

— Google Word2Vec

— Transformers

— sklearn

— other packages listed in requirements.txt

To install the requirements, run pip install -r requirements.txt.

Environment

— OS: CentOS Linux release 7.7.1908

— CPU: Intel(R) Xeon(R) Gold 5218 @ 2.30GHz (64 cores)

— GPU: Four Tesla V100-SXM2 32GB

— CUDA: 10.2

Running

Prepare the pre-trained model:

— 1. Get the BERT pre-trained model and generate the embeddings (./word2vec/get_pre_bert.sh);

  — You can get the word embeddings through the official BERT or Bert-As-Service;

  — Google Word2Vec;

  — GloVe;

— 2. Put the pre-trained model (Google Word2Vec / BERT) into the corresponding path;
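Whichever encoder you choose (BERT, Word2Vec, or GloVe), the preparation step above produces per-token vectors that must be reduced to fixed-size representations. As a minimal sketch of that reduction, assuming mean-pooling over token vectors (the repository's actual pooling strategy may differ, and the encoder call itself is omitted here):

```python
# Illustrative only: `token_vectors` stands in for the per-token output of a
# pre-trained encoder such as BERT (shape: seq_len x dim). Mean-pooling is one
# common way to collapse it into a single sentence-level vector.
import numpy as np

def mean_pool(token_vectors: np.ndarray) -> np.ndarray:
    """Average per-token embeddings (seq_len, dim) into one (dim,) vector."""
    return token_vectors.mean(axis=0)

# Hypothetical 5-token sentence with 768-dim BERT-style vectors.
rng = np.random.default_rng(0)
tokens = rng.normal(size=(5, 768))
sentence_vec = mean_pool(tokens)
print(sentence_vec.shape)  # (768,)
```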

Run the baseline models:

— python train_base.py --model_name xxx --dataset xxx

Run the EATN model:

— python train_eatn.py (the default transfer task is Laptop-2-Restaurant / L2R; the parameters can be changed in the .py file)
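The baseline entry point takes --model_name and --dataset flags. A minimal sketch of such a command-line interface, assuming an argparse-based parser (the defaults and accepted values here are hypothetical, not taken from the repository):

```python
# Hypothetical reconstruction of the CLI that train_base.py exposes.
# Only the two documented flags are modeled; their defaults are assumptions.
import argparse

def build_parser() -> argparse.ArgumentParser:
    parser = argparse.ArgumentParser(description="Train a baseline ABSA model.")
    parser.add_argument("--model_name", default="lstm",
                        help="which baseline model to train (assumed default)")
    parser.add_argument("--dataset", default="laptop",
                        help="which dataset to train on (assumed default)")
    return parser

# Example invocation: python train_base.py --model_name lstm --dataset restaurant
args = build_parser().parse_args(["--model_name", "lstm", "--dataset", "restaurant"])
print(args.model_name, args.dataset)  # lstm restaurant
```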

Contact

If you have any problems with this library, please create an issue or send us an email at:

[email protected]

[email protected]

Citation

If the data and code are useful for your research, please kindly give us a star and cite our paper as follows:

@ARTICLE{9415156,
  author={Zhang, Kai and Liu, Qi and Qian, Hao and Xiang, Biao and Cui, Qing and Zhou, Jun and Chen, Enhong},
  journal={IEEE Transactions on Knowledge and Data Engineering}, 
  title={EATN: An Efficient Adaptive Transfer Network for Aspect-level Sentiment Analysis}, 
  year={2021},
  volume={},
  number={},
  pages={1-1},
  doi={10.1109/TKDE.2021.3075238}}
