
au1206 / paper_annotations

License: MIT
A place to keep track of all the annotated papers.

Projects that are alternatives to, or similar to, paper_annotations

Transferlearning
Transfer learning / domain adaptation / domain generalization / multi-task learning, etc. Papers, code, datasets, applications, and tutorials.
Stars: ✭ 8,481 (+8734.38%)
Mutual labels:  paper, transfer-learning
Qlib
Qlib is an AI-oriented quantitative investment platform, which aims to realize the potential, empower the research, and create the value of AI technologies in quantitative investment. With Qlib, you can easily try your ideas to create better Quant investment strategies. An increasing number of SOTA Quant research works/papers are released in Qlib.
Stars: ✭ 7,582 (+7797.92%)
Mutual labels:  paper, research-paper
best AI papers 2021
A curated list of the latest breakthroughs in AI (in 2021) by release date with a clear video explanation, link to a more in-depth article, and code.
Stars: ✭ 2,740 (+2754.17%)
Mutual labels:  paper, research-paper
Awesome Domain Adaptation
A collection of AWESOME things about domain adaptation
Stars: ✭ 3,357 (+3396.88%)
Mutual labels:  paper, transfer-learning
favorite-research-papers
Listing my favorite research papers 📝 from different fields as I read them.
Stars: ✭ 12 (-87.5%)
Mutual labels:  transfer-learning, research-paper
Nlp Paper
NLP Paper
Stars: ✭ 484 (+404.17%)
Mutual labels:  paper, transfer-learning
paper-survey
Summary of machine learning papers
Stars: ✭ 26 (-72.92%)
Mutual labels:  paper, research-paper
Awesome Computer Vision
Awesome Resources for Advanced Computer Vision Topics
Stars: ✭ 92 (-4.17%)
Mutual labels:  paper, transfer-learning
Awesome Transfer Learning
Best transfer learning and domain adaptation resources (papers, tutorials, datasets, etc.)
Stars: ✭ 1,349 (+1305.21%)
Mutual labels:  paper, transfer-learning
PaperShell
Nice and flexible template environment for papers written in LaTeX
Stars: ✭ 117 (+21.88%)
Mutual labels:  paper, research-paper
Context-Transformer
Context-Transformer: Tackling Object Confusion for Few-Shot Detection, AAAI 2020
Stars: ✭ 89 (-7.29%)
Mutual labels:  transfer-learning
DeepCAD
code for our ICCV 2021 paper "DeepCAD: A Deep Generative Network for Computer-Aided Design Models"
Stars: ✭ 74 (-22.92%)
Mutual labels:  paper
aml-keras-image-recognition
A sample Azure Machine Learning project for Transfer Learning-based custom image recognition by utilizing Keras.
Stars: ✭ 14 (-85.42%)
Mutual labels:  transfer-learning
AutoInAgda
Proof automation – for Agda, in Agda.
Stars: ✭ 38 (-60.42%)
Mutual labels:  paper
deep-learning
Projects include the application of transfer learning to build a convolutional neural network (CNN) that identifies the artist of a painting, the building of predictive models for Bitcoin price data using Long Short-Term Memory recurrent neural networks (LSTMs) and a tutorial explaining how to build two types of neural network using as input the…
Stars: ✭ 43 (-55.21%)
Mutual labels:  transfer-learning
TransTQA
Author: Wenhao Yu ([email protected]). EMNLP'20. Transfer Learning for Technical Question Answering.
Stars: ✭ 12 (-87.5%)
Mutual labels:  transfer-learning
sioyek
Sioyek is a PDF viewer designed for reading research papers and technical books.
Stars: ✭ 3,890 (+3952.08%)
Mutual labels:  research-paper
AB distillation
Knowledge Transfer via Distillation of Activation Boundaries Formed by Hidden Neurons (AAAI 2019)
Stars: ✭ 105 (+9.38%)
Mutual labels:  transfer-learning
WSDM2022-PTUPCDR
This is the official implementation of our paper Personalized Transfer of User Preferences for Cross-domain Recommendation (PTUPCDR), which has been accepted by WSDM2022.
Stars: ✭ 65 (-32.29%)
Mutual labels:  transfer-learning
Hotpur
A fork of Purpur that aims to improve performance and add FabricMC compatibility.
Stars: ✭ 17 (-82.29%)
Mutual labels:  paper

A place to keep track of all the annotated research papers. The aim of this repo is to house annotated versions of trending and impactful research papers in machine learning. In my opinion, the skill of reading research papers is important for beginners and experienced folks alike. But with the plethora of blog-style content out there, and with papers full of jargon and complex terms (callbacks to older papers, concepts, etc.), newer folks are simply drifting away from reading papers.

To make paper reading more accessible, I annotate the papers themselves: adding insights, breaking down some of the jargon, and carefully color-coding the highlights to distinguish prior work from the work proposed in the paper. This is my attempt to give back to the community in the tiniest of ways :D. I hope it is helpful and encourages a habit of paper reading.


Papers

1. PICK: Processing Key Information Extraction from Documents using Improved Graph Learning-Convolutional Networks (ICPR 2020)
2. Attention Is All You Need (NeurIPS 2017)
3. MLP-Mixer: An all-MLP Architecture for Vision (NeurIPS 2021)
4. BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding (NAACL 2019)
5. EfficientNet: Rethinking Model Scaling for Convolutional Neural Networks (ICML 2019)
6. EfficientNetV2: Smaller Models and Faster Training (ICML 2021)
7. Few-Shot Named Entity Recognition: A Comprehensive Study (arXiv, Dec 2020)
8. RoBERTa: A Robustly Optimized BERT Pretraining Approach (arXiv, Jul 2019)
9. LayoutLM: Pre-training of Text and Layout for Document Image Understanding (KDD 2020)
10. Fastformer: Additive Attention Can Be All You Need (arXiv, Sep 2021)
11. LayoutLMv2: Multi-Modal Pre-training for Visually-Rich Document Understanding (ACL 2021)
12. WebFormer: The Web-page Transformer for Structure Information Extraction (WWW 2022)
13. An Attention Free Transformer (arXiv, Sep 2021)
14. DiT: Self-Supervised Pre-training for Document Image Transformer (arXiv, Mar 2022)

Sample Annotations

Color Scheme

Green: Topics about the current paper
Yellow: Topics about other relevant references
Blue: Implementation details, math, and experiments
Red: My own thoughts, questions, and understandings
