GNN-Recommender-Systems: An index of recommendation algorithms based on Graph Neural Networks.
Stars: ✭ 505 (+1.41%)
3DInfomax: Making self-supervised learning work on molecules by using their 3D geometry to pre-train GNNs. Implemented in DGL and PyTorch Geometric.
Stars: ✭ 107 (-78.51%)
gnn-lspe: Source code for GNN-LSPE (Graph Neural Networks with Learnable Structural and Positional Representations), ICLR 2022.
Stars: ✭ 165 (-66.87%)
SubGNN: Subgraph Neural Networks (NeurIPS 2020).
Stars: ✭ 136 (-72.69%)
GRACE: [GRL+ @ ICML 2020] PyTorch implementation of "Deep Graph Contrastive Representation Learning" (https://arxiv.org/abs/2006.04131v2).
Stars: ✭ 144 (-71.08%)
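GRACE and the other graph contrastive learning entries above train by contrasting two augmented views of the same graph with an InfoNCE (NT-Xent) objective. A minimal NumPy sketch of the cross-view form of that loss follows; the function name and the simplification to cross-view negatives only are assumptions for illustration (GRACE's full objective also counts intra-view negatives), not any repository's API.

```python
import numpy as np

def nt_xent(z1, z2, tau=0.5):
    """Cross-view InfoNCE loss: for each node, its embedding under view 1
    should be closest to the same node's embedding under view 2 (the
    diagonal of the similarity matrix); all other pairs act as negatives."""
    z1 = z1 / np.linalg.norm(z1, axis=1, keepdims=True)   # L2-normalise rows
    z2 = z2 / np.linalg.norm(z2, axis=1, keepdims=True)
    sim = z1 @ z2.T / tau                                 # cosine similarities / temperature
    logsumexp = np.log(np.exp(sim).sum(axis=1))           # denominator per anchor node
    return (logsumexp - np.diag(sim)).mean()              # -log softmax of the positive

# toy check: aligned views score a lower loss than mismatched ones
rng = np.random.default_rng(0)
z = rng.normal(size=(4, 3))
print(nt_xent(z, z) < nt_xent(z, z[::-1]))
```

Minimising this pulls the two augmented embeddings of each node together while pushing apart embeddings of different nodes.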
Euler: A distributed graph deep learning framework.
Stars: ✭ 2,701 (+442.37%)
GCL: A list of publications in Graph Contrastive Learning.
Stars: ✭ 25 (-94.98%)
GNNLens2: A visualization tool for Graph Neural Networks.
Stars: ✭ 155 (-68.88%)
BGCN: A TensorFlow implementation of "Bayesian Graph Convolutional Neural Networks" (AAAI 2019).
Stars: ✭ 129 (-74.1%)
SAN: [ECCV 2020] Scale Adaptive Network: Learning to Learn Parameterized Classification Networks for Scalable Input Images.
Stars: ✭ 41 (-91.77%)
SimP-GCN: Implementation of the WSDM 2021 paper "Node Similarity Preserving Graph Convolutional Networks".
Stars: ✭ 43 (-91.37%)
grail: Inductive relation prediction by subgraph reasoning, ICML'20.
Stars: ✭ 83 (-83.33%)
mtad-gat-pytorch: PyTorch implementation of MTAD-GAT (Multivariate Time-Series Anomaly Detection via Graph Attention Networks) by Zhao et al. (2020, https://arxiv.org/abs/2009.02040).
Stars: ✭ 85 (-82.93%)
PyTorch Geometric: Graph Neural Network library for PyTorch.
Stars: ✭ 13,359 (+2582.53%)
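Several of the GCN repositories above implement variants of the Kipf & Welling propagation rule, H' = σ(D̂^(-1/2) Â D̂^(-1/2) H W) with Â = A + I. A minimal NumPy sketch of one such layer is below; the function name, toy graph, and random weights are illustrative assumptions, not PyTorch Geometric's API.

```python
import numpy as np

def gcn_layer(A, H, W):
    """One GCN propagation step: ReLU(D^-1/2 (A+I) D^-1/2 H W)."""
    A_hat = A + np.eye(A.shape[0])        # add self-loops
    d = A_hat.sum(axis=1)                 # degrees of the self-looped graph
    D_inv_sqrt = np.diag(d ** -0.5)       # symmetric normalisation
    return np.maximum(0.0, D_inv_sqrt @ A_hat @ D_inv_sqrt @ H @ W)

# toy 3-node path graph, 2-d node features, 2 output channels
A = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], dtype=float)
H = np.random.default_rng(0).normal(size=(3, 2))
W = np.random.default_rng(1).normal(size=(2, 2))
out = gcn_layer(A, H, W)
print(out.shape)  # (3, 2): one updated feature vector per node
```

The dense matrix form shown here is only practical for small graphs; real libraries use sparse message passing for the same computation.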
DiGCL: The PyTorch implementation of Directed Graph Contrastive Learning (DiGCL), NeurIPS 2021.
Stars: ✭ 27 (-94.58%)
QGNN: Quaternion Graph Neural Networks (ACML 2021) (PyTorch and TensorFlow).
Stars: ✭ 31 (-93.78%)
DCGCN: Densely Connected Graph Convolutional Networks for Graph-to-Sequence Learning (the authors' MXNet implementation for the TACL 2019 paper).
Stars: ✭ 73 (-85.34%)
gemnet pytorch: GemNet model in PyTorch, as proposed in "GemNet: Universal Directional Graph Neural Networks for Molecules" (NeurIPS 2021).
Stars: ✭ 80 (-83.94%)
ProteinGCN: Protein model quality assessment using Graph Convolutional Networks.
Stars: ✭ 88 (-82.33%)
StellarGraph: Machine learning on graphs.
Stars: ✭ 2,235 (+348.8%)
Awesome Graph Classification: A collection of important graph embedding, classification, and representation learning papers with implementations.
Stars: ✭ 4,309 (+765.26%)
GCA: [WWW 2021] Source code for "Graph Contrastive Learning with Adaptive Augmentation".
Stars: ✭ 69 (-86.14%)
SelfGNN: A PyTorch implementation of "SelfGNN: Self-supervised Graph Neural Networks without Explicit Negative Sampling", which appeared at the International Workshop on Self-Supervised Learning for the Web (SSL'21) at the Web Conference 2021 (WWW'21).
Stars: ✭ 24 (-95.18%)
kdtf: Knowledge distillation using TensorFlow.
Stars: ✭ 139 (-72.09%)
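The knowledge distillation repositories in this list build on the classic Hinton-style objective: a weighted sum of the usual cross-entropy on labels and a temperature-softened KL divergence between teacher and student logits. A minimal NumPy sketch is below; the function names and the default T and alpha values are illustrative assumptions, not taken from any of these repositories.

```python
import numpy as np

def softmax(z, T=1.0):
    """Temperature-scaled softmax; higher T flattens the distribution."""
    z = z / T
    e = np.exp(z - z.max(axis=-1, keepdims=True))  # shift for numerical stability
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    """alpha * CE(student, labels) + (1 - alpha) * T^2 * KL(teacher_T || student_T).
    The T^2 factor keeps the soft-target gradients comparable in scale."""
    p_s = softmax(student_logits)
    ce = -np.log(p_s[np.arange(len(labels)), labels]).mean()  # hard-label loss
    p_t_soft = softmax(teacher_logits, T)
    p_s_soft = softmax(student_logits, T)
    kl = (p_t_soft * (np.log(p_t_soft) - np.log(p_s_soft))).sum(axis=-1).mean()
    return alpha * ce + (1 - alpha) * T**2 * kl

# toy batch of one example with 3 classes
student = np.array([[2.0, 0.5, -1.0]])
teacher = np.array([[1.5, 0.7, -0.8]])
print(distillation_loss(student, teacher, np.array([0])))
```

When the student matches the teacher exactly, the KL term vanishes and only the hard-label cross-entropy remains.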
MetaD2A: Official PyTorch implementation of "Rapid Neural Architecture Search by Learning to Generate Graphs from Datasets" (ICLR 2021).
Stars: ✭ 49 (-90.16%)
Pretrained Language Model: Pretrained language models and related optimization techniques developed by Huawei Noah's Ark Lab.
Stars: ✭ 2,033 (+308.23%)
RSC-Net: Implementation of "3D Human Pose, Shape and Texture from Low-Resolution Images and Videos" (TPAMI 2021).
Stars: ✭ 43 (-91.37%)
how attentive are gats: Code for the paper "How Attentive are Graph Attention Networks?" (ICLR 2022).
Stars: ✭ 200 (-59.84%)
ACCV TinyGAN: BigGAN; knowledge distillation; black-box; fast training; 16x compression.
Stars: ✭ 62 (-87.55%)
FGD: Focal and Global Knowledge Distillation for Detectors (CVPR 2022).
Stars: ✭ 124 (-75.1%)
ntds 2019: Material for the EPFL master's course "A Network Tour of Data Science", 2019 edition.
Stars: ✭ 62 (-87.55%)
L2-GCN: [CVPR 2020] Layer-Wise and Learned Efficient Training of Graph Convolutional Networks.
Stars: ✭ 26 (-94.78%)
RioGNN: Reinforced Neighborhood Selection Guided Multi-Relational Graph Neural Networks.
Stars: ✭ 46 (-90.76%)
ProSelfLC-2021: Noisy labels; missing labels; semi-supervised learning; entropy; uncertainty; robustness and generalisation.
Stars: ✭ 45 (-90.96%)
SemCKD: The official implementation of the AAAI 2021 paper "Cross-Layer Distillation with Semantic Calibration".
Stars: ✭ 42 (-91.57%)
distill-and-select: The authors' official PyTorch implementation of "DnS: Distill-and-Select for Efficient and Accurate Video Indexing and Retrieval" (IJCV 2022).
Stars: ✭ 43 (-91.37%)
head-network-distillation: [IEEE Access] "Head Network Distillation: Splitting Distilled Deep Neural Networks for Resource-constrained Edge Computing Systems" and [ACM MobiCom HotEdgeVideo 2019] "Distilled Split Deep Neural Networks for Edge-assisted Real-time Systems".
Stars: ✭ 27 (-94.58%)
FKD: A Fast Knowledge Distillation Framework for Visual Recognition.
Stars: ✭ 49 (-90.16%)
bert-AAD: Adversarial Adaptation with Distillation for BERT Unsupervised Domain Adaptation.
Stars: ✭ 27 (-94.58%)
mmrazor: OpenMMLab model compression toolbox and benchmark.
Stars: ✭ 644 (+29.32%)
GalaXC: Graph Neural Networks with Labelwise Attention for Extreme Classification.
Stars: ✭ 28 (-94.38%)
Revisiting-Contrastive-SSL: Revisiting Contrastive Methods for Unsupervised Learning of Visual Representations (NeurIPS 2021).
Stars: ✭ 81 (-83.73%)
MoTIS: Mobile (iOS) text-to-image search powered by multimodal semantic representation models (e.g., OpenAI's CLIP). Accepted at NAACL 2022.
Stars: ✭ 60 (-87.95%)
AB distillation: Knowledge Transfer via Distillation of Activation Boundaries Formed by Hidden Neurons (AAAI 2019).
Stars: ✭ 105 (-78.92%)