
XBTinChina / CCN_Association

Licence: other
Cognitive Computational Neuroscience Association (认知计算神经联盟) discussion seminar

Projects that are alternatives of or similar to CCN Association

awesome-cogsci
An Awesome List of Cognitive Science Resources
Stars: ✭ 71 (+44.9%)
Mutual labels:  neuroscience, cognitive-science
leabra
Go implementation of Leabra algorithm for biologically-based models of cognition, based on emergent framework (with Python interface)
Stars: ✭ 38 (-22.45%)
Mutual labels:  computational-neuroscience, cognitive-science
PsyNeuLink
A block modeling system for cognitive neuroscience
Stars: ✭ 73 (+48.98%)
Mutual labels:  neuroscience, cognitive-science
BrainPy
Brain Dynamics Programming in Python
Stars: ✭ 242 (+393.88%)
Mutual labels:  neuroscience, computational-neuroscience
PyRhO
A virtual optogenetics laboratory
Stars: ✭ 30 (-38.78%)
Mutual labels:  neuroscience, computational-neuroscience
syncopy
Systems Neuroscience Computing in Python: user-friendly analysis of large-scale electrophysiology data
Stars: ✭ 19 (-61.22%)
Mutual labels:  neuroscience, computational-neuroscience
neuronunit
A package for data-driven validation of neuron and ion channel models using SciUnit
Stars: ✭ 36 (-26.53%)
Mutual labels:  neuroscience, computational-neuroscience
brian2cuda
A brian2 extension to simulate spiking neural networks on GPUs
Stars: ✭ 46 (-6.12%)
Mutual labels:  neuroscience, computational-neuroscience
Open Computational Neuroscience Resources
A publicly-editable collection of open computational neuroscience resources
Stars: ✭ 234 (+377.55%)
Mutual labels:  neuroscience
stringer-pachitariu-et-al-2018a
Recordings of 10k neurons during spontaneous behaviors
Stars: ✭ 43 (-12.24%)
Mutual labels:  neuroscience
Brayns
Visualizer for large-scale and interactive ray-tracing of neurons
Stars: ✭ 232 (+373.47%)
Mutual labels:  neuroscience
Pyphi
A toolbox for integrated information theory.
Stars: ✭ 246 (+402.04%)
Mutual labels:  neuroscience
CellExplorer
CellExplorer is a graphical user interface, a standardized processing module and data structure for exploring and classifying single cells acquired using extracellular electrodes.
Stars: ✭ 55 (+12.24%)
Mutual labels:  neuroscience
Brainiak
Brain Imaging Analysis Kit
Stars: ✭ 232 (+373.47%)
Mutual labels:  neuroscience
spikeflow
Python library for easy creation and running of spiking neural networks in tensorflow.
Stars: ✭ 30 (-38.78%)
Mutual labels:  computational-neuroscience
Awesome Computational Neuroscience
A list of schools and researchers in computational neuroscience
Stars: ✭ 230 (+369.39%)
Mutual labels:  neuroscience
Moabb
Mother of All BCI Benchmarks
Stars: ✭ 214 (+336.73%)
Mutual labels:  neuroscience
ndstore
code for storing neurodata images and image annotations
Stars: ✭ 39 (-20.41%)
Mutual labels:  neuroscience
nftsim
C++ library for simulation of multiscale neural field dynamics
Stars: ✭ 24 (-51.02%)
Mutual labels:  neuroscience
nengo-dl
Deep learning integration for Nengo
Stars: ✭ 76 (+55.1%)
Mutual labels:  neuroscience

Cognitive Computational Neuroscience Association (认知计算神经联盟)

Seminar Schedule

Mailing list: https://groups.google.com/g/ccn_association_23 — for researchers and industry professionals in cognitive science, psychology, computational modeling, neural networks, neuroscience, or brain science to post job/recruiting/conference/lecture announcements and requests for help.


  • Hosts: 张洳源 (https://ruyuanzhang.github.io/), 滕相斌 (https://sites.google.com/site/xiangbinteng2/)

  • Founding members: 宗雷, 洳源, 海洋, 张磊, 相斌.

  • WeChat group: 认知计算神经联盟 (apologies — the group is full; we are working out how to announce the weekly meeting ID to everyone)

  • Tencent Meeting ID: announced in the group every Sunday

  • Time: every Monday, 8:00–9:30 PM Beijing time

  • If you are interested, please contact [email protected] with your WeChat ID, full name, and affiliation, and we will add you to the group.

Seminar Topics

Cognitive science + Computational science + Neuroscience = CCN

The format is a talk plus discussion; active participation is expected, and the goal is for everyone to learn something.

  1. Each week we invite someone working in cognitive computational neuroscience to present their work, or
  2. walk the group through a paper on deep learning and modeling in brain science.
  3. Each session lasts one and a half hours.

Presenters do not have to cover the latest literature — classic papers are welcome. It does not have to be original research; reviews and opinion pieces work too. You can also present your own work, give a tutorial, or use the seminar as a dry run for an upcoming formal talk (e.g., at a lab meeting or conference).

Seminar Style

  1. Everyone is expected to actively ask questions and join the discussion.
  2. This is not a lecture — don't be shy; if you have a question, ask it.
  3. Make sure you have a working microphone, and turn on your camera if possible.
  4. Everyone is equal in academic discussion; disregard seniority and rank.
  5. Keep the conversation relevant to the seminar's academic content.
  6. Avoid disparaging remarks, e.g., about gender, race, age, region, school, degree, or nationality.