A PyTorch Implementation of "Watch Your Step: Learning Node Embeddings via Graph Attention" (NeurIPS 2018).
GPL-3.0 License
This repository provides an implementation of Attention Walk as described in the paper:
Watch Your Step: Learning Node Embeddings via Graph Attention. Sami Abu-El-Haija, Bryan Perozzi, Rami Al-Rfou, Alexander A. Alemi. NeurIPS, 2018. [Paper]
The original Tensorflow implementation is available [here].
The codebase is implemented in Python 3.5.2. The package versions used for development are listed below.
networkx 2.4
tqdm 4.28.1
numpy 1.15.4
pandas 0.23.4
texttable 1.5.0
scipy 1.1.0
argparse 1.1.0
torch 1.1.0
torchvision 0.3.0
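As a quick sanity check before running the scripts, the pinned versions above can be compared against what is actually installed. This is a small optional sketch, not part of the repository; it uses `importlib.metadata`, which requires Python 3.8+ (newer than the 3.5.2 the repo targets), and the version strings are copied from this README.

```python
# Optional sanity check against the versions pinned in the README.
from importlib import metadata

PINNED = {
    "networkx": "2.4",
    "tqdm": "4.28.1",
    "numpy": "1.15.4",
    "pandas": "0.23.4",
    "texttable": "1.5.0",
    "scipy": "1.1.0",
    "torch": "1.1.0",
    "torchvision": "0.3.0",
}

def installed_version(name):
    """Return the installed version string, or None if the package is missing."""
    try:
        return metadata.version(name)
    except metadata.PackageNotFoundError:
        return None

for name, wanted in PINNED.items():
    print(f"{name}: installed={installed_version(name)} pinned={wanted}")
```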
Learning of the embedding is handled by the `src/main.py` script, which provides the following command-line arguments.
--edge-path STR Input graph path. Default is `input/chameleon_edges.csv`.
--embedding-path STR Embedding path. Default is `output/chameleon_AW_embedding.csv`.
--attention-path STR Attention path. Default is `output/chameleon_AW_attention.csv`.
--dimensions INT Number of embedding dimensions. Default is 128.
--epochs INT Number of training epochs. Default is 200.
--window-size INT Skip-gram window size. Default is 5.
--learning-rate FLOAT Learning rate value. Default is 0.01.
--beta FLOAT Attention regularization parameter. Default is 0.5.
--gamma FLOAT Embedding regularization parameter. Default is 0.5.
--num-of-walks INT Number of walks per source node. Default is 80.
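The flags above map naturally onto Python's standard `argparse` module. The sketch below is a hypothetical parser mirroring the documented flags and defaults, not necessarily the exact code in `src/main.py`:

```python
import argparse

def parameter_parser(args=None):
    """Hypothetical parser for the Attention Walk flags; defaults taken from the README."""
    parser = argparse.ArgumentParser(description="Run Attention Walk.")
    parser.add_argument("--edge-path", type=str, default="input/chameleon_edges.csv",
                        help="Input graph path.")
    parser.add_argument("--embedding-path", type=str, default="output/chameleon_AW_embedding.csv",
                        help="Embedding path.")
    parser.add_argument("--attention-path", type=str, default="output/chameleon_AW_attention.csv",
                        help="Attention path.")
    parser.add_argument("--dimensions", type=int, default=128,
                        help="Number of embedding dimensions.")
    parser.add_argument("--epochs", type=int, default=200,
                        help="Number of training epochs.")
    parser.add_argument("--window-size", type=int, default=5,
                        help="Skip-gram window size.")
    parser.add_argument("--learning-rate", type=float, default=0.01,
                        help="Learning rate value.")
    parser.add_argument("--beta", type=float, default=0.5,
                        help="Attention regularization parameter.")
    parser.add_argument("--gamma", type=float, default=0.5,
                        help="Embedding regularization parameter.")
    parser.add_argument("--num-of-walks", type=int, default=80,
                        help="Number of walks per source node.")
    return parser.parse_args(args)

args = parameter_parser([])  # empty list -> README defaults
```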
Creating an Attention Walk embedding of the default dataset with 256 dimensions.
python src/main.py --dimensions 256
Creating an Attention Walk embedding of the default dataset with a higher window size.
python src/main.py --window-size 20
Creating an embedding of another dataset, the Twitch Brasilians, and saving the outputs under custom file names.
python src/main.py --edge-path input/ptbr_edges.csv --embedding-path output/ptbr_AW_embedding.csv --attention-path output/ptbr_AW_attention.csv
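Once training finishes, the embedding CSV can be read back for downstream tasks. The snippet below is a minimal sketch that assumes a plausible layout (a header row followed by one row per node, with the node id in the first column); the actual column names in the repo's output may differ:

```python
import csv
import io

# Assumed layout of the embedding CSV: header row, then one row per node,
# node id first followed by the embedding coordinates. This is an illustration,
# not the repo's guaranteed output format.
sample = "id,x_0,x_1\n0,0.12,-0.34\n1,0.56,0.78\n"

def load_embedding(handle):
    """Read an embedding CSV into a {node_id: [coordinates]} dict."""
    reader = csv.reader(handle)
    next(reader)  # skip the header row
    return {int(row[0]): [float(value) for value in row[1:]] for row in reader}

embedding = load_embedding(io.StringIO(sample))
```

In practice the `io.StringIO(sample)` stand-in would be replaced by `open("output/chameleon_AW_embedding.csv")`.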
License

GPL-3.0