Brains

Neural networks made of real neurons, in functioning brains

Usefulness: 🔧
Novelty: 💡
Uncertainty: 🤪 🤪 🤪
Incompleteness: 🚧 🚧 🚧

How do brains work?

Dr. Greg Dunn and Dr. Brian Edwards, Self Reflected: a brain, slightly stylized.

I mean: how do brains work at a level slightly higher than the synapse, but much lower than, e.g., psychology? “How is thought done?”, etc.

Notes pertaining to large artificial networks are filed under artificial neural networks; the messy, biological end of the stick is here. Since brains seem to be the seat of the flashiest and most important part of the computing taking place in our bodies, we understandably want to know how they work.

Real brains differ from the “neuron-inspired” computation of their artificial simulacra in very many ways, beyond the usual difference between model and reality. The resemblance between artificial “neural networks” and real neurons is intentionally loose, for reasons of convenience.

For one example, most simulated neural networks are built on continuous activations updated in discrete time, unlike spiking biological neurons, which are driven by discrete events in continuous time.
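To make the contrast concrete, here is a minimal sketch (my own illustration, not any particular model from the literature): a standard artificial unit computing a continuous activation once per discrete step, next to a leaky integrate-and-fire caricature of a spiking neuron whose output is a list of spike times. The time constant, threshold, and input current are arbitrary illustrative values.

```python
import numpy as np

# Artificial unit: continuous activation, evaluated once per discrete time step.
def relu_unit(inputs, weights):
    """A 'neuron-inspired' unit: weighted sum followed by a nonlinearity."""
    return max(0.0, float(np.dot(inputs, weights)))

# Biological caricature: leaky integrate-and-fire neuron. The membrane
# potential evolves in continuous time (approximated here by small Euler
# steps) and the output is a set of discrete spike events at real-valued times.
def lif_spike_times(input_current, t_max=1.0, dt=1e-4,
                    tau=0.02, v_thresh=1.0, v_reset=0.0):
    v, t, spikes = 0.0, 0.0, []
    while t < t_max:
        v += dt * (-v + input_current) / tau  # leaky integration
        if v >= v_thresh:
            spikes.append(t)                  # a discrete event in continuous time
            v = v_reset
        t += dt
    return spikes

print(relu_unit(np.array([0.2, -0.5]), np.array([1.0, 0.3])))  # one number per step
print(lif_spike_times(1.5)[:5])                                # a handful of spike times
```

The point of the toy comparison is only that the two objects have different output types: the first returns a real number every step, the second returns event times, which is one reason the statistics of spike trains need their own machinery.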

Real brains also support heterogeneous types of neuron, have messier layer organisation, use far less power, and have no well-defined backpropagation (or at least not in the same form), among many other differences that I, as a non-specialist, do not know about.

To learn more about:

Fun data

Allen Institute Brain Observatory.
