well-classified-examples-are-underestimated: Code for the AAAI 2022 publication "Well-classified Examples are Underestimated in Classification with Deep Neural Networks"
Stars: ✭ 21 (-57.14%)
Mutual labels: transformer, graph-neural-networks
Walk-Transformer: From Random Walks to Transformer for Learning Node Embeddings (ECML-PKDD 2020) (in PyTorch and TensorFlow)
Stars: ✭ 26 (-46.94%)
Mutual labels: transformer, graph-neural-networks
graphtrans: Representing Long-Range Context for Graph Neural Networks with Global Attention
Stars: ✭ 45 (-8.16%)
Mutual labels: transformer, graph-neural-networks
php-serializer: Serialize PHP variables, including objects, in any format. Supports unserializing them too.
Stars: ✭ 47 (-4.08%)
Mutual labels: transformer
GNNLens2: Visualization tool for Graph Neural Networks
Stars: ✭ 155 (+216.33%)
Mutual labels: graph-neural-networks
bytekit: A utility library for Java byte manipulation (not a bytecode manipulation library)
Stars: ✭ 40 (-18.37%)
Mutual labels: transformer
cape: Continuous Augmented Positional Embeddings (CAPE) implementation for PyTorch
Stars: ✭ 29 (-40.82%)
Mutual labels: transformer
SelfGNN: A PyTorch implementation of the paper "SelfGNN: Self-supervised Graph Neural Networks without explicit negative sampling", which appeared in the International Workshop on Self-Supervised Learning for the Web (SSL'21) @ the Web Conference 2021 (WWW'21)
Stars: ✭ 24 (-51.02%)
Mutual labels: graph-neural-networks
ru-dalle: Generate images from text, in Russian
Stars: ✭ 1,606 (+3177.55%)
Mutual labels: transformer
keras-vision-transformer: The TensorFlow/Keras implementation of Swin-Transformer and Swin-UNET
Stars: ✭ 91 (+85.71%)
Mutual labels: transformer
project-code-py: Leetcode using AI
Stars: ✭ 100 (+104.08%)
Mutual labels: transformer
grb: Graph Robustness Benchmark, a scalable, unified, modular, and reproducible benchmark for evaluating the adversarial robustness of Graph Machine Learning
Stars: ✭ 70 (+42.86%)
Mutual labels: graph-neural-networks
libai: LiBai (李白), a toolbox for large-scale distributed parallel training
Stars: ✭ 284 (+479.59%)
Mutual labels: transformer
StoreItemDemand: (117th place, top 26%) Deep learning using Keras and Spark for the "Store Item Demand Forecasting" Kaggle competition
Stars: ✭ 24 (-51.02%)
Mutual labels: kaggle
fastT5: ⚡ Boost the inference speed of T5 models by 5x and reduce the model size by 3x
Stars: ✭ 421 (+759.18%)
Mutual labels: transformer
Cross-lingual-Summarization: Zero-Shot Cross-Lingual Abstractive Sentence Summarization through Teaching Generation and Attention
Stars: ✭ 28 (-42.86%)
Mutual labels: transformer
ViTs-vs-CNNs: [NeurIPS 2021] Are Transformers More Robust Than CNNs? (PyTorch implementation and checkpoints)
Stars: ✭ 145 (+195.92%)
Mutual labels: transformer
R-MeN: Transformer-based Memory Networks for Knowledge Graph Embeddings (ACL 2020) (PyTorch and TensorFlow)
Stars: ✭ 74 (+51.02%)
Mutual labels: transformer