jessicayung / Deep Learning Map
# deep-learning-map
Map of deep learning and notes from papers.
Math is rendered in the [KaTeX-friendly version on GitLab] ([GitHub version]).
Note (13 June 2018): I am revamping the lists to structure them around more meaningful questions. The lists (previous version) can be found in the directory `previous-lists`.
README contents:
- Contents of this Repo
- Vision for the Deep Learning Map
- Summaries
- Topics to cover (in the DL map)
- Recommended Deep Learning Resources
- Existing repositories with summaries of papers on machine learning
- Newsletters
- Other Resources
- Development notes
## 0. Contents of this Repo

- Paper summaries: `summaries/`
  - See the 'Summaries' section in this README for details.
- Glossaries (all works in progress)
  - `glossary.md`
  - `basics-glossary.md`
    - Version with LaTeX equations rendered: `basics-glossary.ipynb`
  - `ai-safety-glossary.md`
- Lists
  - NOTE: Many of these comprise only items I've come across in my reading since Dec 2017, so these lists don't represent my view of e.g. 'the most important datasets', though I try to include only items I think are significant.
  - As of 13 June 2018, many of these may be in the `previous-lists` directory.
  - `problem-areas/`
    - `artificial-general-intelligence.md`
    - `images.md`
  - `tools/`
    - `datasets.md`
    - `environments.md`: environments for training DL algorithms.
    - `hardware.md`
    - `libraries.md`
  - `papers-to-print.md`: some papers I'm interested in.
  - `nn-components.md`: list of components of neural networks. A first step towards drawing a 'map' of Deep Learning.
- Misc: `other-resources.md`
- Implementations
  - `implementations/neural-networks`: implementations of deep learning algorithms (early stages; currently has a 2D MLP working).
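For a rough idea of what a minimal 2D-input MLP involves, here is a sketch (my own illustration, not the code in `implementations/neural-networks`; the layer size, loss and training loop are all assumptions) that trains a one-hidden-layer network on XOR:

```python
import numpy as np

# Minimal 2-input MLP with one hidden layer, trained on XOR.
# Illustrative sketch only; hyperparameters are assumptions.
rng = np.random.default_rng(0)

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(0, 1, (2, 8))   # input -> hidden weights
b1 = np.zeros(8)
W2 = rng.normal(0, 1, (8, 1))   # hidden -> output weights
b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 1.0
for _ in range(2000):
    # Forward pass
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    # Backward pass for mean squared error loss
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    # Full-batch gradient descent update
    W2 -= lr * h.T @ d_out
    b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h
    b1 -= lr * d_h.sum(axis=0)

preds = (sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2) > 0.5).astype(int)
```

The whole network is just two matrix multiplications with an elementwise nonlinearity between them, which is why a 2D MLP makes a good first implementation exercise.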
## 1. Vision for the Deep Learning Map
The idea is to write (or link to) paper summaries, blog posts or articles that will help people (especially those starting out):
- Understand what different models or terms mean
- Know what the state-of-the-art results are in each domain
- Be able to look up known advantages and disadvantages of key models and approaches
- See how different concepts connect with each other
It is thus crucial that
- these summaries are presented in a way that makes the relationships between different concepts clear (hence this being a 'map'), and that
- materials are chosen selectively so as not to overwhelm the reader.
This is still in its early stages: at the moment it is more a collection of paper summaries and definitions of terms than a map. It also contains material outside of deep learning, mostly in machine learning or neuroscience.
Let me know if you'd like to contribute or have suggestions.
## 2. Summaries
- Population-based training of Neural Networks (Nov 2017)
- Leave no Trace: Learning to Reset for Safe and Autonomous Reinforcement Learning (Nov 2017)
- AI Safety Gridworlds (Nov 2017)
- Concrete Problems in AI Safety (July 2016)
- Adversarial Spheres (Jan 2018)
## 3. Topics to cover

- DQN
  - Deep convolutional Q-learning
- A3C (Asynchronous Advantage Actor-Critic)
  - A2C
- Policy gradient methods
  - Trust Region Policy Optimisation (TRPO)
  - Proximal Policy Optimisation (PPO)
- Hierarchical networks
  - Feudal networks
- Auxiliary tasks
  - UNREAL
- Dilated convolutions
- Dilated LSTMs
- Quasi-recurrent NNs
- Hierarchical RNNs
- Capsule Networks
- AI Safety
## 4. Recommended Deep Learning Resources
See also my effective deep learning resources shortlist.
- Notes for Stanford course CS231n on Convolutional Neural Networks
  - I have found this to be great for learning and an excellent reference (I often landed here in my first few months of Googling about deep learning terms/problems!).
- Deep Learning terms glossary on WildML
  - Short descriptions of terms (techniques, architectures, frameworks) with links to relevant resources (blog posts, papers).
## 5. Existing repositories with summaries of papers on machine learning
Have added a ⭐️ to the ones I find particularly helpful.
- ⭐️ Alexander Jung
  - Summaries:
    - What, How, Results summary. Easy to digest.
    - Bullet points with images (result graphs, architectures).
    - Links to related resources (paper website, video).
  - Lists by date added, with brief tags such as 'self-driving cars', 'segmentation', 'gan'.
  - Around 80 papers added from 2016-17; last active Dec 2017.
  - Starred because summaries are easy to digest and understand.
- ⭐️ Denny Britz
  - Summaries:
    - Short TL;DR section that gives the authors' argument, method and high-level findings.
    - Sometimes contain key points (usually model, experiment or result details) and thoughts sections.
  - Lists summaries by date reviewed; the list also includes many other papers without summaries (are these key papers or ones he's read?).
  - About 100 summaries. Last active Nov 2017.
  - Starred because summaries are short but to the point, and because there are many summaries.
  - I also recommend his weekly newsletter 'The Wild Week in AI' and his blog WildML, which has some great tutorials and a deep learning glossary.
    - In some way the glossary has done part of what I'd like to do with this map. Haha, great!
    - Oh, and did I mention his repo of implementations of RL algorithms? :O I haven't even gone through this properly yet.
- Dibyatanoy Bhattacharjee (Yale)
  - Summaries:
    - Longer summaries that outline paper content and sometimes include definitions.
    - Paragraphs or bullet points with images (architecture diagrams).
    - 'Points to Ponder' sections (sometimes).
  - Lists all summaries (titles AND summary content) on one webpage. Interesting because you can scroll through all of them in one go and see what catches your attention.
  - 7 summaries. Last active May 2017.
  - Mostly papers on RNNs / memory / translation.
- Abhishek Das (CS PhD student at Georgia Tech)
  - Summaries:
    - Provides a brief summary of the model in each paper. Does not seem to include results.
    - Gives opinions on the strengths and weaknesses of the paper.
    - Seems to be text only.
  - Lists paper summaries by paper year.
  - Around 40 papers from 2012-2017; last active August 2017.
- yunjey
  - Summaries:
    - Short, high-level summaries.
    - Short list of contributions of each paper.
    - Sometimes includes opinions on similarities of papers to existing models.
  - Lists by topic (yes!) with author, publication month/year and conference (e.g. NIPS, ICLR) if applicable.
    - The list is unfortunately not linked to the summaries.
  - 6 summaries but many more papers listed. Last active c. Dec 2016.
- Patrick Emami (CS PhD student at U of Florida)
  - Summaries:
    - Paragraph-based summary of each paper: what the authors propose, model outline, experiment outline.
    - 'My Notes' section with opinions on main contributions, weaknesses/questions and areas for further research. Nice.
  - Lists papers by topic.
  - About 40 summaries. Last active August 2017.
## 6. Newsletters
Including these here because they contain fantastic summaries of what's going on in industry and in research. They take some time to go through though.
- ImportAI by Jack Clark
  - Weekly email newsletter.
  - Noteworthy items with headlines and more detailed descriptions.
  - May take more time to go through, but the summaries are of high quality and are often structured (e.g. briefly describing testing methods, results).
  - Hilarious tech fiction at the end of the newsletter. :)
- The Wild Week in AI by Denny Britz
  - Weekly email newsletter.
  - Sections:
    - News
    - Posts, Articles, Tutorials
    - Code, Projects & Data
  - Links with brief summaries (and occasionally helpful context or opinions).
## 7. Other Resources
## 8. Development Notes
### 9 Dec 2017
- Add brainstormed list of topics to cover. The idea is to write (or link to) paper summaries, good blog posts or articles that will help people with limited experience get a better idea of what is going on in the field. This means:
- Understanding what different models or terms mean
- Knowing what the state-of-the-art results are in each domain
- Being able to look up known advantages and disadvantages of key models and approaches
- Seeing how different concepts connect with each other
- The target audience is not currently experienced researchers, but the hope is that researchers will eventually benefit from this as well.
- I will also be going through Goodfellow et al.'s book 'Deep Learning' and may add insights or summaries from the book (referencing these appropriately).
- Difficulties: it is hard to know how to connect concepts with each other initially, so I will first
  - (1) write paper summaries,
  - (2) write a list of summaries of key terminology, and
  - (3) build a spreadsheet trying to list connections in parallel.
- The spreadsheet is important: I believe that it will be greatly beneficial to have a visual map and not just a list of papers because the latter is much harder to digest.
- Everything will likely be scattered at first, but I hope the pieces will start coming together after the first month.