
TIP: Tri-graph Information Propagation model for Polypharmacy Side Effect Prediction (GRL@NeurIPS, 2019)


Tri-graph Information Propagation (TIP) model

TIP is an efficient, general approach to multi-relational link prediction in any multi-modal (i.e. heterogeneous and multi-relational) network with two types of nodes. It can also be applied to knowledge graph completion and recommendation tasks. TIP is inspired by the Decagon and R-GCN models, and motivated by their high computational cost and memory demand as the graph grows complex. TIP improves on their link prediction accuracy, as well as the time and space efficiency of node representation learning. See our paper (Xu, Sang, and Lu, 2019) for details of the algorithm.

TIP for Polypharmacy Side Effect Prediction

We are particularly concerned with the safety of polypharmacy, the concurrent use of multiple medications by a patient. Given a pair of drugs (:pill:,:pill:), the TIP model predicts how many polypharmacy side effects the drug pair will have, and how likely each one is.

We use POSE clinical records and pharmacological information to construct a multi-modal biomedical graph with two types of nodes: Drug (D) and Protein (P). The graph contains three types of interaction, corresponding to three subgraphs:

  🍪   D-D graph: drug-drug interactions with side effects as edge labels

  🍰   P-D graph: protein-drug interactions (with a fixed label)

  🍨   P-P graph: protein-protein interactions (with a fixed label)
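To make the tri-graph structure concrete, here is a minimal sketch that represents the three subgraphs as labelled edge lists. The node IDs and side-effect labels below are toy values for illustration only, not entries from the real POSE/BioSNAP data:

```python
from collections import defaultdict

# Toy tri-graph: two node types (Drug, Protein), three subgraphs.
# IDs and side-effect labels are made up for illustration only.

# D-D edges: (drug_i, drug_j, side_effect_label) -- multi-relational
dd_edges = [
    (0, 1, "nausea"),
    (0, 1, "headache"),   # one drug pair can carry several labels
    (1, 2, "dizziness"),
]

# P-D edges: (protein, drug), single fixed relation
pd_edges = [(0, 0), (1, 0), (2, 1)]

# P-P edges: (protein_i, protein_j), single fixed relation
pp_edges = [(0, 1), (1, 2)]

# The multi-relational D-D graph is the prediction target:
# group the side-effect labels observed for each drug pair.
labels_per_pair = defaultdict(set)
for i, j, label in dd_edges:
    labels_per_pair[(i, j)].add(label)

print(labels_per_pair[(0, 1)])  # {'nausea', 'headache'} -- two side effects
```

Only the D-D edges are labelled; the P-D and P-P subgraphs each use a single fixed relation, exactly as in the list above.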

The TIP model embeds proteins and drugs into different spaces, of possibly different dimensions, in the encoder, and predicts side effects of drug combinations in the decoder. As shown below, TIP first learns protein embeddings on the P-P graph, then passes them to the D-D graph via the P-D graph. On the D-D graph, TIP learns the drug embeddings and predicts relationships between drugs.

TIP Encoder:

TIP Decoder:
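The propagation order (P-P, then P-D, then D-D) can be sketched in plain Python with mean aggregation standing in for the actual learned GCN layers. All features and edges here are toy values; this only illustrates the data flow, not the real model:

```python
# Sketch of TIP's propagation order with toy scalar features.
# Mean aggregation is a stand-in for the learned GCN layers.

def mean_aggregate(features, edges, n_targets):
    """For each target node, average the features of its source neighbours."""
    sums = [0.0] * n_targets
    counts = [0] * n_targets
    for src, dst in edges:
        sums[dst] += features[src]
        counts[dst] += 1
    return [s / c if c else 0.0 for s, c in zip(sums, counts)]

protein_x = [1.0, 2.0, 3.0]          # initial protein features
drug_x = [0.5, 0.5]                  # initial drug features

# Step 1: protein embeddings from the P-P graph (edges are (src, dst)).
pp_edges = [(0, 1), (1, 0), (1, 2), (2, 1)]
protein_h = mean_aggregate(protein_x, pp_edges, n_targets=3)

# Step 2: pass protein embeddings to drugs via the P-D graph.
pd_edges = [(0, 0), (1, 0), (2, 1)]  # (protein, drug)
drug_from_protein = mean_aggregate(protein_h, pd_edges, n_targets=2)

# Step 3: on the D-D graph, combine with the drug features to get the
# drug embeddings that the decoder scores for side effects.
drug_h = [d + p for d, p in zip(drug_x, drug_from_protein)]
print(drug_h)  # [2.5, 2.5]
```

The point of the ordering is that drug embeddings are computed only after protein information has already been aggregated and passed across the P-D bridge.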

Source Code

TIP is implemented in PyTorch with the PyG (PyTorch Geometric) package. It is developed and tested under Python 3.

Requirement

You can install the pytorch and pyg packages in the versions that match your hardware, or reproduce our environment with the following commands:

$ conda create -n tip-gpu python==3.9
$ conda install pytorch==1.8.0 torchvision==0.9.0 torchaudio==0.8.0 cudatoolkit=10.2 -c pytorch
$ conda install pyg==2.0.1 -c pyg -c conda-forge

(Optional) If you are interested in monitoring GPU memory usage of the model, the pytorch_memlab package is helpful.

$ pip install pytorch_memlab

(Optional) TIP is trained and tested on a single GPU. If you are interested in training TIP using multiple GPUs, pytorch_lightning would be helpful.

Running

The processed data and the code for data processing are in the ./data/ folder. The raw datasets are available on BioSNAP. See ./data.ipynb for the full polypharmacy dataset analysis and data preprocessing.

Step 1: prepare the data. Run this once to generate a data_dict.pkl file in the ./data/ folder.

python prepare.py
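The generated file is a standard Python pickle, so it can be inspected directly. The sketch below round-trips a small dict the same way; the key names are hypothetical stand-ins, so inspect the real data_dict.pkl for its actual structure:

```python
import os
import pickle
import tempfile

# Round-trip a small dict the way prepare.py stores data_dict.pkl.
# NOTE: the keys below are hypothetical; check the real file's structure.
data_dict = {
    "dd_edge_index": [(0, 1)],
    "pd_edge_index": [(0, 0)],
    "pp_edge_index": [(0, 1)],
}

path = os.path.join(tempfile.mkdtemp(), "data_dict.pkl")
with open(path, "wb") as f:
    pickle.dump(data_dict, f)

with open(path, "rb") as f:
    loaded = pickle.load(f)

print(sorted(loaded))  # inspect the top-level keys
```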

Step 2: train and test the model. The default model is TIP-cat. To train and test a TIP-add model instead, change the value of the variable MOD from 'cat' to 'add'.
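Roughly speaking, the two variants differ in how drug features and protein-derived features are combined before decoding: concatenation versus element-wise addition. The toy sketch below only illustrates that shape difference, not the actual learned layers in tip.py:

```python
# TIP-cat vs TIP-add, sketched on toy feature vectors.
# 'cat' concatenates drug and protein-derived features (doubling the
# embedding width); 'add' sums them element-wise (width unchanged).
# This mirrors the MOD = 'cat' / 'add' switch only conceptually.

def combine(drug_feat, protein_feat, mod="cat"):
    if mod == "cat":
        return drug_feat + protein_feat          # list concatenation
    if mod == "add":
        return [d + p for d, p in zip(drug_feat, protein_feat)]
    raise ValueError(f"unknown mod: {mod!r}")

drug_feat = [1.0, 2.0]
protein_feat = [3.0, 4.0]

print(combine(drug_feat, protein_feat, "cat"))  # [1.0, 2.0, 3.0, 4.0]
print(combine(drug_feat, protein_feat, "add"))  # [4.0, 6.0]
```

Concatenation keeps the two information sources separate at the cost of a wider embedding, while addition keeps the dimension fixed.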

python tip.py

Following the above steps with the default hyper-parameter settings reproduces the results reported in the TIP paper (Xu, Sang, and Lu, 2019).

🌚🌒🌓🌔 Please browse or open an issue if you have any questions or ideas 🌖🌗🌘🌚

Cite Us

If you find this work useful, please cite us:

@inproceedings{xu2019tip,
	title={Tri-graph Information Propagation for Polypharmacy Side Effect Prediction},
	author={Hao Xu and Shengqi Sang and Haiping Lu},
	booktitle={NeurIPS Workshop on Graph Representation Learning},
	year={2019}
}

License

TIP is licensed under the MIT License.

Contributors ✨

Thanks goes to these wonderful people (emoji key):


sangsq

💻 ⚠️ 🤔 📖

Haiping Lu

📖 ️️️️♿️

Shreeyash

🐛

Chertoganov

🐛

ZillaRU

🐛

Jiaxi Jiang

🐛

This project follows the all-contributors specification. Contributions of any kind welcome!