Papers: Biological and Artificial Neural Networks

I have collected papers on artificial neural networks that are related to neuroscience (especially computational neuroscience). If a paper is missing from this list, I would appreciate it if you could let me know via an Issue.

Artificial neural networks and computational neuroscience

Survey

  • D. Cox, T. Dean. "Neural networks and neuroscience-inspired computer vision". Curr. Biol. 24(18), 921-929 (2014). (sciencedirect)
  • A. Marblestone, G. Wayne, K. Kording. "Toward an integration of deep learning and neuroscience". (2016). (arXiv)
  • O. Barak. "Recurrent neural networks as versatile tools of neuroscience research". Curr. Opin. Neurobiol. (2017). (sciencedirect)
  • D. Silva, P. Cruz, A. Gutierrez. "Are the long-short term memory and convolution neural net biological system?". KICS. 4(2), 100-106 (2018). (sciencedirect)
  • N. Kriegeskorte, P. Douglas. "Cognitive computational neuroscience". Nat. Neurosci. 21(9), 1148-1160 (2018). (arXiv)
  • N. Kriegeskorte, T. Golan. "Neural network models and deep learning - a primer for biologists". (2019). (arXiv)
  • K.R. Storrs, N. Kriegeskorte. "Deep Learning for Cognitive Neuroscience". (2019). (arXiv)
  • T.C. Kietzmann, P. McClure, N. Kriegeskorte. "Deep Neural Networks in Computational Neuroscience". Oxford Research Encyclopaedia of Neuroscience. (2019). (Oxford, bioRxiv)
  • J.S. Bowers. "Parallel Distributed Processing Theory in the Age of Deep Networks". Trends. Cogn. Sci. (2019). (sciencedirect)
  • R.M. Cichy, D. Kaiser. "Deep Neural Networks as Scientific Models". Trends. Cogn. Sci. (2019). (sciencedirect)
  • S. Musall, A.E. Urai, D. Sussillo, A.K. Churchland. "Harnessing behavioral diversity to understand neural computations for cognition". Curr. Opin. Neurobiol. (2019). (sciencedirect)
  • B.A. Richards, T.P. Lillicrap, et al. "A deep learning framework for neuroscience". Nat. Neurosci. (2019). (Nat. Neurosci.)
  • U. Hasson, S.A. Nastase, A. Goldstein. "Direct Fit to Nature: An Evolutionary Perspective on Biological and Artificial Neural Networks". Neuron. (2020). (Neuron)
  • A. Saxe, S. Nelli, C. Summerfield. "If deep learning is the answer, then what is the question?". (2020). (arXiv)

Issue

  • T.P. Lillicrap, K.P. Kording. "What does it mean to understand a neural network?". (2019). (arXiv)

Analysis methods for neural networks

Methods for understanding the neural representations of ANNs.

Survey

  • D. Barrett, A. Morcos, J. Macke. "Analyzing biological and artificial neural networks: challenges with opportunities for synergy?". (2018). (arXiv)

Neuron Feature

  • I. Rafegas, M. Vanrell, L.A. Alexandre. "Understanding trained CNNs by indexing neuron selectivity". (2017). (arXiv)
  • A. Nguyen, J. Yosinski, J. Clune. "Understanding Neural Networks via Feature Visualization: A survey". (2019). (arXiv)
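
The basic recipe behind the feature-visualization work above is activation maximization: treat a chosen unit's activation as an objective and optimize the input itself by gradient ascent. Below is a minimal numpy sketch on a toy two-layer network; the network, unit index, step size, and norm constraint are illustrative assumptions, not taken from either paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy two-layer network with fixed random weights (illustrative only).
W1 = rng.normal(size=(64, 100)) / 10.0
W2 = rng.normal(size=(10, 64)) / 8.0

def unit_activation(x, unit):
    """Activation of one output unit: out[unit] = W2[unit] @ tanh(W1 @ x)."""
    return W2[unit] @ np.tanh(W1 @ x)

def grad_wrt_input(x, unit):
    """Hand-derived gradient of out[unit] with respect to the input x."""
    pre = W1 @ x
    return W1.T @ ((1.0 - np.tanh(pre) ** 2) * W2[unit])

# Activation maximization: gradient ascent on the input, with a crude
# norm constraint standing in for the image priors used in practice.
x = rng.normal(size=100) * 0.01
for step in range(200):
    x += 0.1 * grad_wrt_input(x, unit=3)
    x /= max(1.0, np.linalg.norm(x) / 10.0)

print("optimized activation:", unit_activation(x, unit=3))
```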

Comparing the representations of neural networks with those of the brain

Representational similarity analysis (RSA)
  • N. Kriegeskorte, J. Diedrichsen. "Peeling the Onion of Brain Representations". Annu. Rev. Neurosci. (2019). (Annu Rev Neurosci)
Canonical correlation analysis (CCA)
  • M. Raghu, J. Gilmer, J. Yosinski, J. Sohl-Dickstein. "SVCCA: Singular Vector Canonical Correlation Analysis for Deep Learning Dynamics and Interpretability". NIPS. (2017). (arXiv)
  • H. Wang, et al. "Finding the needle in high-dimensional haystack: A tutorial on canonical correlation analysis". (2018). (arXiv)
Centered kernel alignment (CKA)
  • S. Kornblith, M. Norouzi, H. Lee, G. Hinton. "Similarity of Neural Network Representations Revisited". (2019). (arXiv)
Representational stability analysis (ReStA)
  • S. Abnar, L. Beinborn, R. Choenni, W. Zuidema. "Blackbox meets blackbox: Representational Similarity and Stability Analysis of Neural Language Models and Brains". (2019). (arXiv)
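
All of the measures above start from the same object: one (stimuli × units) activation matrix per system. As a concrete reference point, here is a minimal numpy/scipy sketch of two of them, RSA (correlating representational dissimilarity matrices) and the linear CKA of Kornblith et al.; the random matrices below stand in for real network activations or brain recordings.

```python
import numpy as np
from scipy.spatial.distance import pdist
from scipy.stats import spearmanr

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 100))  # system 1: 50 stimuli x 100 units
Y = rng.normal(size=(50, 30))   # system 2: same 50 stimuli, 30 units

# RSA: build a representational dissimilarity matrix (RDM) per system,
# then correlate the two RDMs across stimulus pairs.
rdm_x = pdist(X, metric="correlation")  # condensed upper triangle
rdm_y = pdist(Y, metric="correlation")
rsa_score = spearmanr(rdm_x, rdm_y).correlation

def linear_cka(X, Y):
    """Linear CKA (Kornblith et al., 2019) between two activation matrices."""
    X = X - X.mean(axis=0)  # center each unit
    Y = Y - Y.mean(axis=0)
    hsic = np.linalg.norm(Y.T @ X, "fro") ** 2
    return hsic / (np.linalg.norm(X.T @ X, "fro") * np.linalg.norm(Y.T @ Y, "fro"))

print("RSA (Spearman):", rsa_score)
print("linear CKA:", linear_cka(X, Y))
```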

Fixed point analysis for RNN

  • M.B. Ottaway, P.Y. Simard, D.H. Ballard. "Fixed point analysis for recurrent networks". NIPS. (1989). (pdf)
  • D. Sussillo, O. Barak. "Opening the black box: low-dimensional dynamics in high-dimensional recurrent neural networks". Neural Comput. 25(3), 626-649 (2013). (MIT Press, Jupyter notebook)
  • M.D. Golub, D. Sussillo. "FixedPointFinder: A Tensorflow toolbox for identifying and characterizing fixed points in recurrent neural networks". JOSS. (2018). (pdf, GitHub)
  • G.E. Katz, J.A. Reggia. "Using Directional Fibers to Locate Fixed Points of Recurrent Neural Networks". IEEE. (2018). (IEEE)
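
The procedure popularized by Sussillo & Barak can be stated compactly: define the "speed" q(h) = ½‖F(h) − h‖², minimize it from many initial states, and treat minima with q ≈ 0 as fixed points, which are then characterized by linearization. Below is a minimal scipy sketch for a vanilla tanh RNN; the random weights are a placeholder for a trained network, whose task-visited states would normally seed the search.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
N = 32
W = rng.normal(size=(N, N)) / np.sqrt(N)  # placeholder recurrent weights
b = rng.normal(size=N) * 0.1

def F(h):
    """One step of the autonomous RNN: h_{t+1} = tanh(W @ h_t + b)."""
    return np.tanh(W @ h + b)

def speed(h):
    """q(h) = 0.5 * ||F(h) - h||^2, which is ~0 exactly at a fixed point."""
    d = F(h) - h
    return 0.5 * d @ d

candidates = []
for _ in range(10):
    h0 = rng.normal(size=N)  # in practice: states visited during the task
    res = minimize(speed, h0, method="L-BFGS-B")
    if res.fun < 1e-8:
        candidates.append(res.x)

print(f"{len(candidates)} candidate fixed points found")
# Eigenvalues of the Jacobian of F at each point then classify it
# (stable attractor, saddle, ...), as in the papers above.
```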

Ablation analysis

  • A.S. Morcos, D.G.T. Barrett, N.C. Rabinowitz, M. Botvinick. "On the importance of single directions for generalization". ICLR. (2018). (arXiv)
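
The ablation analysis in Morcos et al. reduces to a simple loop: zero out units one at a time and track how task performance degrades; heavy reliance on single directions is the interesting signal. Here is a minimal numpy sketch of that loop; the activations, readout weights, and "labels" are random stand-ins for a trained model and its data.

```python
import numpy as np

def ablation_curve(acts, readout, labels, rng):
    """Accuracy as hidden units are zeroed out in random order.

    acts:    (n_examples, n_units) activations of one layer
    readout: (n_units, n_classes) weights of the layer above
    """
    acts = acts.copy()
    accs = []
    for u in rng.permutation(acts.shape[1]):
        acts[:, u] = 0.0  # ablate one more unit
        pred = (acts @ readout).argmax(axis=1)
        accs.append((pred == labels).mean())
    return accs

rng = np.random.default_rng(0)
H = rng.normal(size=(200, 64))    # stand-in hidden activations
Wout = rng.normal(size=(64, 10))  # stand-in readout weights
y = (H @ Wout).argmax(axis=1)     # "labels" = the intact model's outputs
curve = ablation_curve(H, Wout, y, rng)
print("accuracy after ablating half the units:", curve[31])
```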

Computational psychiatry

I haven't been able to completely survey papers in this field.

  • R.E. Hoffman, U. Grasemann, R. Gueorguieva, D. Quinlan, D. Lane, R. Miikkulainen. "Using computational patients to evaluate illness mechanisms in schizophrenia". Biol. Psychiatry. 69(10), 997–1005 (2011). (PMC)

Deep neural networks as models of the brain

Understanding the neural representations of the brain is difficult. Neural networks learn specific tasks (or are optimized for specific loss functions), and sometimes they acquire the same representations as the brain. In those cases, we can indirectly infer the purpose of the corresponding neural representations in the brain.

Survey

  • A.J.E. Kell, J.H. McDermott. "Deep neural network models of sensory systems: windows onto the role of task constraints". Curr. Opin. Neurobiol. (2019). (sciencedirect)

Cortical neuron

  • P. Poirazi, T. Brannon, B.W. Mel. "Pyramidal Neuron as Two-Layer Neural Network". Neuron. 37(6). (2003). (Neuron)
  • B. David, S. Idan, L. Michael. "Single Cortical Neurons as Deep Artificial Neural Networks". (2019). (bioRxiv)

Vision

  • D. Zipser, R.A. Andersen. "A back-propagation programmed network that simulates response properties of a subset of posterior parietal neurons". Nature. 331, 679–684 (1988). (Nature)
  • A. Krizhevsky, I. Sutskever, G. Hinton. "ImageNet classification with deep convolutional neural networks". NIPS (2012). (pdf)
    • (cf.) I. Goodfellow, Y. Bengio, A. Courville. "Deep Learning". MIT Press. (2016): Chapter 9.10 "The Neuroscientific Basis for Convolutional Networks"
  • D. Yamins, et al. "Performance-optimized hierarchical models predict neural responses in higher visual cortex". PNAS. 111(23), 8619-8624 (2014). (PNAS)
  • S. Khaligh-Razavi, N. Kriegeskorte. "Deep supervised, but not unsupervised, models may explain IT cortical representation". PLoS Comput. Biol. 10(11), (2014). (PLOS)
  • U. Güçlü, M.A.J. van Gerven. "Deep Neural Networks Reveal a Gradient in the Complexity of Neural Representations across the Ventral Stream". J. Neurosci. 35(27), (2015). (J. Neurosci.)
  • D. Yamins, J. DiCarlo. "Eight open questions in the computational modeling of higher sensory cortex". Curr. Opin. Neurobiol. 37, 114–120 (2016). (sciencedirect)
  • K.M. Jozwik, N. Kriegeskorte, K.R. Storrs, M. Mur. "Deep Convolutional Neural Networks Outperform Feature-Based But Not Categorical Models in Explaining Object Similarity Judgments". Front. Psychol. (2017). (Front. Psychol)
  • M.N.U. Laskar, L.G.S. Giraldo, O. Schwartz. "Correspondence of Deep Neural Networks and the Brain for Visual Textures". (2018). (arXiv)
  • I. Kuzovkin, et al. "Activations of Deep Convolutional Neural Network are Aligned with Gamma Band Activity of Human Visual Cortex". Commun. Biol. 1 (2018). (Commun. Biol.)
  • M. Schrimpf, et al. "Brain-Score: Which Artificial Neural Network for Object Recognition is most Brain-Like?". (2018). (bioRxiv)
  • E. Kim, D. Hannan, G. Kenyon. "Deep Sparse Coding for Invariant Multimodal Halle Berry Neurons". CVPR. (2018). (arXiv)
  • S. Ocko, J. Lindsey, S. Ganguli, S. Deny. "The emergence of multiple retinal cell types through efficient coding of natural movies". (2018). (bioRxiv)
  • Q. Yan, et al. "Revealing Fine Structures of the Retinal Receptive Field by Deep Learning Networks". (2018). (arXiv)
  • H. Wen, J. Shi, W. Chen, Z. Liu. "Deep Residual Network Predicts Cortical Representation and Organization of Visual Features for Rapid Categorization". Sci.Rep. (2018). (Sci.Rep.)
  • J. Lindsey, S. Ocko, S. Ganguli, S. Deny. "A Unified Theory of Early Visual Representations from Retina to Cortex through Anatomically Constrained Deep CNNs". (2019). (arXiv)
  • I. Fruend. "Simple, biologically informed models, but not convolutional neural networks describe target detection in naturalistic images". bioRxiv (2019). (bioRxiv)
  • A. Doerig, et al. "Capsule Networks but not Classic CNNs Explain Global Visual Processing". (2019). (bioRxiv)
  • A.S. Benjamin, et al. "Hue tuning curves in V4 change with visual context". (2019). (bioRxiv)
  • S. Baek, M. Song, J. Jang, et al. "Spontaneous generation of face recognition in untrained deep neural networks". (2019). (bioRxiv)

Recurrent networks for object recognition

  • C. J. Spoerer, P. McClure, N. Kriegeskorte. "Recurrent Convolutional Neural Networks: A Better Model of Biological Object Recognition". Front. Psychol. (2017). (Front. Psychol)
  • A. Nayebi, D. Bear, J. Kubilius, K. Kar, S. Ganguli, D. Sussillo, J. DiCarlo, D. Yamins. "Task-Driven Convolutional Recurrent Models of the Visual System". (2018). (arXiv, GitHub)
  • T.C. Kietzmann, et al. "Recurrence required to capture the dynamic computations of the human ventral visual stream". (2019). (arXiv)
  • K. Qiao, et al. "Category decoding of visual stimuli from human brain activity using a bidirectional recurrent neural network to simulate bidirectional information flows in human visual cortices". (2019). (arXiv)
  • K. Kar, J. Kubilius, K. Schmidt, E.B. Issa, J.J. DiCarlo. "Evidence that recurrent circuits are critical to the ventral stream’s execution of core object recognition behavior". Nat. Neurosci. (2019). (Nat. Neurosci., bioRxiv)
  • T.C. Kietzmann, C.J. Spoerer, L.K.A. Sörensen, R.M. Cichy, O. Hauk, N. Kriegeskorte. "Recurrence is required to capture the representational dynamics of the human visual system". PNAS. (2019). (PNAS)

Primary visual cortex (V1)

  • S.A. Cadena, et al. "Deep convolutional models improve predictions of macaque V1 responses to natural images". PLOS Comput. Biol. (2019). (PLOS, bioRxiv)
  • A.S. Ecker, et al. "A rotation-equivariant convolutional neural network model of primary visual cortex". ICLR (2019). (OpenReview, arXiv)

Visual illusion

Also see the papers associated with PredNet.

  • E.J. Ward. "Exploring Perceptual Illusions in Deep Neural Networks". (2019). (bioRxiv)
  • E.D. Sun, R. Dekel. "ImageNet-trained deep neural network exhibits illusion-like response to the Scintillating Grid". (2019). (arXiv)

Recursive Cortical Network (RCN; non NN model)

  • D. George, et al. "A generative vision model that trains with high data efficiency and breaks text-based CAPTCHAs". Science (2017). (Science, GitHub)

Weight shared ResNet as RNN for object recognition

  • Q. Liao, T. Poggio. "Bridging the Gaps Between Residual Learning, Recurrent Neural Networks and Visual Cortex". (2016). (arXiv)
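
Liao & Poggio's point is an exact correspondence: a ResNet whose blocks share one set of weights computes h ← h + f(h) at every "layer", which is precisely an RNN with no external input, unrolled in time. The sketch below just makes the identity explicit; f is an arbitrary placeholder transformation.

```python
import numpy as np

rng = np.random.default_rng(0)
W = rng.normal(size=(16, 16)) / 8.0  # the single, shared weight matrix

def f(h):
    return np.maximum(0.0, W @ h)    # one residual branch: ReLU(W @ h)

def weight_shared_resnet(x, depth):
    """A 'deep' ResNet whose residual blocks all reuse the same weights..."""
    h = x
    for _ in range(depth):
        h = h + f(h)                 # residual block with tied weights
    return h

def rnn(x, steps):
    """...is the same computation as an RNN unrolled for `steps` steps."""
    h = x
    for _ in range(steps):
        h = h + f(h)                 # identical update rule per time step
    return h

x = rng.normal(size=16)
assert np.allclose(weight_shared_resnet(x, 10), rnn(x, 10))
```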

Generating visual super stimuli

  • J. Ukita, T. Yoshida, K. Ohki. "Characterisation of nonlinear receptive fields of visual neurons by convolutional neural network". Sci.Rep. (2019). (Sci.Rep.)
  • C.R. Ponce, et al. "Evolving super stimuli for real neurons using deep generative networks". Cell. 177, 999–1009 (2019). (bioRxiv, Cell)
  • P. Bashivan, K. Kar, J.J DiCarlo. "Neural Population Control via Deep Image Synthesis". Science. (2019). (bioRxiv, Science, GitHub1, GitHub2)
  • A.P. Batista, K.P. Kording. "A Deep Dive to Illuminate V4 Neurons". Trends. Cogn. Sci. (2019). (Trends. Cogn. Sci.)

Visual number sense

  • K. Nasr, P. Viswanathan, A. Nieder. "Number detectors spontaneously emerge in a deep neural network designed for visual object recognition". Sci. Adv. (2019). (Sci. Adv.)

Auditory cortex

  • U. Güçlü, J. Thielen, M. Hanke, M. van Gerven. "Brains on Beats". NIPS. (2016). (arXiv)
  • A.J.E. Kell, D.L.K. Yamins, E.N. Shook, S.V. Norman-Haignere, J.H. McDermott. "A Task-Optimized Neural Network Replicates Human Auditory Behavior, Predicts Brain Responses, and Reveals a Cortical Processing Hierarchy". Neuron. 98(3), (2018). (sciencedirect)
  • T. Koumura, H. Terashima, S. Furukawa. "Cascaded Tuning to Amplitude Modulation for Natural Sound Recognition". J. Neurosci. 39(28), 5517-5533 (2019). (J. Neurosci., bioRxiv, GitHub)

Motor cortex

  • D. Sussillo, M. Churchland, M. Kaufman, K. Shenoy. "A neural network that finds a naturalistic solution for the production of muscle activity". Nat. Neurosci. 18(7), 1025–1033 (2015). (PubMed)
  • J.A. Michaels, et al. "A neural network model of flexible grasp movement generation". (2019). (bioRxiv)
  • J. Merel, M. Botvinick, G. Wayne. "Hierarchical motor control in mammals and machines". Nat. Commun. (2019). (Nat.Commun.)

Spatial coding (Place cells, Grid cells, Head direction cells)

  • C. Cueva, X. Wei. "Emergence of grid-like representations by training recurrent neural networks to perform spatial localization". ICLR. (2018). (arXiv)
  • A. Banino, et al. "Vector-based navigation using grid-like representations in artificial agents". Nature. 557(7705), 429–433 (2018). (pdf, GitHub)
  • J.C.R. Whittington, et al. "Generalisation of structural knowledge in the hippocampal-entorhinal system". NIPS. (2018). (arXiv)
  • C.J. Cueva, P.Y. Wang, M. Chin, X. Wei. "Emergence of functional and structural properties of the head direction system by optimization of recurrent neural networks". (2020). (arXiv)

Rodent barrel cortex

  • C. Zhuang, J. Kubilius, M. Hartmann, D. Yamins. "Toward Goal-Driven Neural Network Models for the Rodent Whisker-Trigeminal System". NIPS. (2017). (arXiv)

Convergent Temperature Representations

  • M. Haesemeyer, A. Schier, F. Engert. "Convergent temperature representations in artificial and biological neural networks". Neuron. (2019). (bioRxiv), (Neuron)

Cognitive task

  • H.F. Song, G.R. Yang, X.J. Wang. "Reward-based training of recurrent neural networks for cognitive and value-based tasks". eLife. 6 (2017). (eLife)
  • G.R. Yang, M.R. Joglekar, H.F. Song, W.T. Newsome, X.J. Wang. "Task representations in neural networks trained to perform many cognitive tasks". Nat. Neurosci. (2019). (Nat. Neurosci.) (GitHub)

Time perception

  • N.F. Hardy, V. Goudar, J.L. Romero-Sosa, D.V. Buonomano. "A model of temporal scaling correctly predicts that motor timing improves with speed". Nat. Commun. 9 (2018). (Nat. Commun.)
  • J. Wang, D. Narain, E.A. Hosseini, M. Jazayeri. "Flexible timing by temporal scaling of cortical responses". Nat. Neurosci. 21, 102–110 (2018). (Nat. Neurosci.)
  • W. Roseboom, Z. Fountas, K. Nikiforou, D. Bhowmik, M. Shanahan, A. K. Seth. "Activity in perceptual classification networks as a basis for human subjective time perception". Nat. Commun. 10 (2019). (Nat. Commun.)
  • B. Deverett, et al. "Interval timing in deep reinforcement learning agents". NeurIPS. (2019). (arXiv)
  • Z. Bi, C. Zhou. "Time representation in neural network models trained to perform interval timing tasks". (2019). (arXiv).

Short-term memory task

  • K. Rajan, C.D. Harvey, D.W. Tank. "Recurrent Network Models of Sequence Generation and Memory". Neuron. 90(1), 128-142 (2016). (sciencedirect)
  • A.E. Orhan, W.J. Ma. "A diverse range of factors affect the nature of neural representations underlying short-term memory". Nat. Neurosci. (2019). (Nat. Neurosci.), (bioRxiv), (GitHub)
  • N.Y. Masse, et al. "Circuit mechanisms for the maintenance and manipulation of information in working memory". Nat. Neurosci. (2019). (Nat. Neurosci.), (bioRxiv)

Language

  • J. Chiang, et al. "Neural and computational mechanisms of analogical reasoning". (2019). (bioRxiv)
  • S. Na, Y.J. Choe, D. Lee, G. Kim. "Discovery of Natural Language Concepts in Individual Units of CNNs". ICLR. (2019). (OpenReview), (arXiv)

Language learning

  • B.M. Lake, T. Linzen, M. Baroni. "Human few-shot learning of compositional instructions". (2019). (arXiv)
  • A. Alamia, V. Gauducheau, D. Paisios, R. VanRullen. "Which Neural Network Architecture matches Human Behavior in Artificial Grammar Learning?". (2019). (arXiv)

Neural network architectures based on neuroscience

Survey

  • D. Hassabis, D. Kumaran, C. Summerfield, M. Botvinick. "Neuroscience-Inspired Artificial Intelligence". Neuron. 95(2), 245-258 (2017). (sciencedirect)

PredNet (Deep predictive coding network)

  • W. Lotter, G. Kreiman, D. Cox. "Deep predictive coding networks for video prediction and unsupervised learning". ICLR. (2017). (arXiv, GitHub)
  • E. Watanabe, A. Kitaoka, K. Sakamoto, M. Yasugi, K. Tanaka. "Illusory Motion Reproduced by Deep Neural Networks Trained for Prediction". Front. Psychol. (2018). (Front. Psychol.)
  • M. Fonseca. "Unsupervised predictive coding models may explain visual brain representation". (2019). (arXiv, GitHub)
  • W. Lotter, G. Kreiman, D. Cox. "A neural network trained to predict future video frames mimics critical properties of biological neuronal responses and perception". Nat. Machine Intelligence. (2020). (arXiv, Nat. Machine Intelligence)

subLSTM

  • R. Costa, Y. Assael, B. Shillingford, N. de Freitas, T. Vogels. "Cortical microcircuits as gated-recurrent neural networks". NIPS. (2017). (arXiv)

Activation functions

  • G.S. Bhumbra. "Deep learning improved by biological activation functions". (2018). (arXiv)

Normalization

  • L.G.S. Giraldo, O. Schwartz. "Integrating Flexible Normalization into Mid-Level Representations of Deep Convolutional Neural Networks". (2018). (arXiv)
  • M.F. Günthner, et al. "Learning Divisive Normalization in Primary Visual Cortex". (2019). (bioRxiv)

Reinforcement Learning

I haven't been able to completely survey papers in this field.

  • N. Haber, D. Mrowca, L. Fei-Fei, D. Yamins. "Learning to Play with Intrinsically-Motivated Self-Aware Agents". NIPS. (2018). (arXiv)
  • J. X. Wang, et al. "Prefrontal cortex as a meta-reinforcement learning system". Nat. Neurosci. (2018). (Nat. Neurosci.), (bioRxiv), (blog)
  • M. Botvinick, et al. "Reinforcement Learning, Fast and Slow". Trends. Cogn. Sci. (2019). (Trends. Cogn. Sci.)
  • E.O. Neftci, B.B. Averbeck. "Reinforcement learning in artificial and biological systems". Nat. Mach. Intell. (2019). (Nat. Mach. Intell.)
  • W. Dabney, Z. Kurth-Nelson, N. Uchida, C.K. Starkweather, D. Hassabis, R. Munos, M. Botvinick. "A distributional code for value in dopamine-based reinforcement learning". Nature. (2020). (Nature). (blog)

Learning and development

Biologically plausible learning algorithms

Survey

  • J. Whittington, R. Bogacz. "Theories of Error Back-Propagation in the Brain". Trends. Cogn. Sci. (2019). (sciencedirect)
  • T.P. Lillicrap, A. Santoro. "Backpropagation through time and the brain". Curr. Opin. Neurobiol. (2019). (sciencedirect)
  • T.P. Lillicrap, A. Santoro, L. Marris, et al. "Backpropagation and the brain". Nat. Rev. Neurosci. (2020). (Nat. Rev. Neurosci.)

Equilibrium Propagation

  • Y. Bengio, D. Lee, J. Bornschein, T. Mesnard, Z. Lin. "Towards Biologically Plausible Deep Learning". (2015). (arXiv)
  • B. Scellier, Y. Bengio. "Equilibrium Propagation: Bridging the Gap Between Energy-Based Models and Backpropagation". Front. Comput. Neurosci. 11(24), (2017). (arXiv)
  • J. Sacramento, R. P. Costa, Y. Bengio, W. Senn. "Dendritic cortical microcircuits approximate the backpropagation algorithm". NIPS. (2018). (arXiv)

Feedback alignment

  • T. Lillicrap, D. Cownden, D. Tweed, C. Akerman. "Random synaptic feedback weights support error backpropagation for deep learning". Nat. Commun. 7 (2016). (Nat. Commun.)
  • A. Nøkland. "Direct Feedback Alignment Provides Learning in Deep Neural Networks". (2016). (arXiv)
  • M. Akrout, C. Wilson, P.C. Humphreys, T. Lillicrap, D. Tweed. "Deep Learning without Weight Transport". (2019). (arXiv)
  • B.J. Lansdell, P. Prakash, K.P. Kording. "Learning to solve the credit assignment problem". (2019). (arXiv)
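
The mechanism introduced by Lillicrap et al. is easy to state: in the backward pass, replace the transpose of the forward weights with a fixed random matrix B; learning still works because the forward weights gradually align with B. Below is a minimal numpy sketch on a two-layer regression network; the teacher task, layer sizes, and learning rate are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n_in, n_hid, n_out = 20, 32, 5

W1 = rng.normal(size=(n_hid, n_in)) * 0.1
W2 = rng.normal(size=(n_out, n_hid)) * 0.1
B = rng.normal(size=(n_hid, n_out)) * 0.1   # fixed random feedback, not W2.T

W_teacher = rng.normal(size=(n_out, n_in)) * 0.3  # target linear map to learn
lr = 0.01

for step in range(2000):
    x = rng.normal(size=(n_in, 64))   # batch of 64 inputs
    y = W_teacher @ x                 # regression targets
    h = np.tanh(W1 @ x)               # forward pass
    e = W2 @ h - y                    # output error

    # Backprop would propagate W2.T @ e; feedback alignment uses B @ e.
    delta_h = (B @ e) * (1.0 - h ** 2)

    W2 -= lr * (e @ h.T) / x.shape[1]
    W1 -= lr * (delta_h @ x.T) / x.shape[1]

print("final batch MSE:", np.mean((W2 @ np.tanh(W1 @ x) - y) ** 2))
```

Direct feedback alignment (Nøkland, 2016) differs only in that the fixed random matrices deliver the output error straight to each hidden layer.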

Local error signal

  • H. Mostafa, V. Ramesh, G. Cauwenberghs. "Deep Supervised Learning Using Local Errors". Front. Neurosci. (2018). (Front. Neurosci.).
  • A. Nøkland, L.H. Eidnes. "Training Neural Networks with Local Error Signals". (2019). (arXiv) (GitHub)

Others

  • M. Jaderberg, et al. "Decoupled Neural Interfaces using Synthetic Gradients". (2016). (arXiv)
  • N. Ke, A. Goyal, O. Bilaniuk, J. Binas, M. Mozer, C. Pal, Y. Bengio. "Sparse Attentive Backtracking: Temporal Credit Assignment Through Reminding". NIPS. (2018). (arXiv)
  • S. Bartunov, A. Santoro, B. Richards, L. Marris, G. Hinton, T. Lillicrap. "Assessing the Scalability of Biologically-Motivated Deep Learning Algorithms and Architectures". NIPS. (2018). (arXiv)
  • R. Feldesh. "The Distributed Engram". (2019). (bioRxiv)
  • Y. Amit. "Deep Learning With Asymmetric Connections and Hebbian Updates". Front. Comput. Neurosci. (2019). (Front. Comput. Neurosci.). (GitHub)
  • T. Mesnard, G. Vignoud, J. Sacramento, W. Senn, Y. Bengio. "Ghost Units Yield Biologically Plausible Backprop in Deep Neural Networks". (2019). (arXiv)

Issue

  • F. Crick. "The recent excitement about neural networks". Nature. 337, 129–132 (1989). (Nature)

Learning dynamics of neural networks and brains

  • J. Shen, M. D. Petkova, F. Liu, C. Tang. "Toward deciphering developmental patterning with deep neural network". (2018). (bioRxiv)
  • A.M. Saxe, J.L. McClelland, S. Ganguli. "A mathematical theory of semantic development in deep neural networks". PNAS. (2019). (arXiv). (PNAS)
  • D.V. Raman, A.P. Rotondo, T. O’Leary. "Fundamental bounds on learning performance in neural circuits". PNAS. (2019). (PNAS)
  • R. C. Wilson, A. Shenhav, M. Straccia, J.D. Cohen. "The Eighty Five Percent Rule for optimal learning". Nat. Commun. (2019). (Nat.Commun.)

Few-shot learning

  • A. Cortese, B.D. Martino, M. Kawato. "The neural and cognitive architecture for learning from a small sample". Curr. Opin. Neurobiol. 55, 133–141 (2019). (sciencedirect)

A Critique of Pure Learning

  • A. Zador. "A Critique of Pure Learning: What Artificial Neural Networks can Learn from Animal Brains". Nat. Commun. (2019). (bioRxiv). (Nat. Commun.)

Brain Decoding & Brain-machine interface

  • E. Matsuo, I. Kobayashi, S. Nishimoto, S. Nishida, H. Asoh. "Generating Natural Language Descriptions for Semantic Representations of Human Brain Activity". ACL SRW. (2016). (ACL Anthology)
  • Y. Güçlütürk, U. Güçlü, K. Seeliger, S.E. Bosch, R.J. van Lier, M.A.J. van Gerven. "Reconstructing perceived faces from brain activations with deep adversarial neural decoding". NIPS. (2017). (NIPS)
  • R. Rao. "Towards Neural Co-Processors for the Brain: Combining Decoding and Encoding in Brain-Computer Interfaces". (2018). (arXiv)
  • G. Shen, T. Horikawa, K. Majima, Y. Kamitani. "Deep image reconstruction from human brain activity". PLOS (2019). (PLOS)

Others

  • M.S. Goldman. "Memory without Feedback in a Neural Network". Neuron (2009). (sciencedirect)
  • R. Yuste. "From the neuron doctrine to neural networks". Nat. Rev. Neurosci. 16, 487–497 (2015). (Nat. Rev. Neurosci.)
  • S. Saxena, J.P. Cunningham. "Towards the neural population doctrine". Curr. Opin. Neurobiol. (2019). (sciencedirect)
  • D.J. Heeger. "Theory of cortical function". PNAS. 114(8), (2017). (PNAS)
  • C.C. Chow, Y. Karimipanah. "Before and beyond the Wilson-Cowan equations". (2019). (arXiv)