SIAM News Blog

Engineered for Function: The Power of Biologically-Constrained Neural Networks for Neurosensory Integration

By Charles B. Delahunt, Charles Fieseler, and J. Nathan Kutz

New opportunities to build bio-inspired models for neurosensory integration arise from data-driven modeling methods. The emergence of rich multimodal data recordings of neurosensory processing systems also enhances biologically-motivated models. We are entering a golden age of access to biological data and structures, made possible by a diverse set of genetically-tailored organisms and a range of recording and stimulation methods, including functional magnetic resonance imaging, electrode arrays, calcium imaging, and optogenetics. Given that many biological architectures resulted from millions of years of competitive pressure to robustly accomplish certain tasks, understanding the network architecture promises payoffs in terms of novel, valuable functionalities that one can apply to machine learning (ML) and artificial intelligence (AI) methods.

Recognizing how the form and structure of neural pathways transform input stimuli into motor-neuron-driven behavioral responses is of particular interest. Data-driven models can integrate neurosensory information into a modern learning and control theoretic framework, enhancing our comprehension of the role of network structure and function. We highlight two model organisms that exploit different network architectures for functionality: the Manduca sexta moth and Caenorhabditis elegans (roundworm). The former uses a large, randomly-connected network for processing sensory (olfactory) information and learning, while the latter functions via a small, stereotyped connectivity graph. These two alternative approaches offer insight into the range of neurosensory strategies that produce robust and stable behaviors for organisms in environments with noisy stimuli.

Neuroscience and ML techniques have existed in partnership for many decades. For instance, neural networks (NNs) were inspired by the Nobel Prize-winning work of David Hubel and Torsten Wiesel, who demonstrated that NNs in the primary visual cortex of cats are organized in hierarchical cell layers to process visual stimuli. The study of NNs is currently textbook material for both neuroscientists and the deep learning community. Anatomical studies and neuronal recordings provide increasing detail of biological structure that helps us move beyond abstract models. Biologically-constrained architectures are critical when explaining the exceptional performance of neurosensory integration with limited and noisy input stimuli, and thus have the power to further revolutionize NN design.

Manduca sexta and Fast Learning

The insect olfactory network—including the antennal lobe (AL) and mushroom bodies (MB)—has evolved for robust, rapid learning. The AL-MB’s key anatomical features include competitive inhibition, random and sparse connectivity, neuromodulator stimulation, Hebbian weight updates (“fire together, wire together”), and large dimension shifts between layers [3, 10, 14]. Though these elements are ubiquitous in biological NNs, researchers do not typically apply them in the context of ML [1, 13]. A model loosely based on the honeybee MB with Hebbian updates was a step toward the application of biological constraints [7]. MothNet applied tighter constraints, calibrated to in vivo electrode recordings [5], and modeled the full AL-MB, including the action of neuromodulators during learning. Unlike standard NN models, this olfactory processing model is dynamic in nature and contains neurons that obey a firing rate model of the form

\[\frac{d\boldsymbol{x}}{dt}= f(\boldsymbol{x},\boldsymbol{W}),\]

where \(\boldsymbol{x}\) denotes the vector of neuronal firing rates and \(\boldsymbol{W}\) represents the connection weights trained for odor classification at the readout neurons (see Figure 1).
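A minimal sketch of such a firing-rate model is easy to write down. The version below assumes a generic leaky dynamics with a sigmoidal activation and a small random connectivity matrix; the actual MothNet equations are more detailed (noise terms, neuromodulator gating, and distinct AL and MB layers):

```python
import numpy as np

def firing_rate_step(x, W, dt=0.01, tau=1.0):
    """One Euler step of a generic firing-rate model dx/dt = f(x, W).

    Here f is taken to be the common leaky form (-x + sigmoid(W x)) / tau;
    this is an illustrative stand-in, not the exact MothNet dynamics.
    """
    drive = 1.0 / (1.0 + np.exp(-W @ x))  # sigmoidal synaptic drive
    dxdt = (-x + drive) / tau
    return x + dt * dxdt

# Simulate a small hypothetical network of 5 neurons.
rng = np.random.default_rng(0)
W = rng.normal(scale=0.5, size=(5, 5))  # random, dense toy connectivity
x = rng.random(5)                       # initial firing rates in [0, 1)
for _ in range(1000):
    x = firing_rate_step(x, W)
print(x.shape)  # (5,)
```

Because the drive is bounded in (0, 1) and the dynamics are leaky, the firing rates remain bounded, which is one reason rate models of this form are numerically well behaved.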

Figure 1. Olfaction processing in the antennal lobe. 1a. Three-dimensional reconstruction of a dopaminergic neuron in Manduca sexta. 1b. A neural network (NN) based on the Manduca sexta antennal lobe and mushroom bodies. This NN outperforms standard machine learning methods at rapid learning of digits from the Modified National Institute of Standards and Technology database. Figure 1a courtesy of [13], 1b courtesy of [4].

When the insect encounters a new rewarded odor, neuromodulators trigger weight updates between neurons \(i\) and \(j\) of the form \(\Delta \boldsymbol{W}_{ij}=\gamma f_i(t)f_j(t)\), where \(f_i(t)\) is the firing rate of the \(i\)th neuron and \(\gamma\) is a learning rate. A crucial payoff of this approach was reproduction of the actual insect’s rapid learning; MothNet attained close to 80 percent accuracy on the Modified National Institute of Standards and Technology (MNIST) database, given just one to 10 samples per class. It outperformed standard ML techniques—including specialized one-shot methods—in this rapid learning regime [4]. These results indicate that one can usefully port the AL-MB structure to an ML context.
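The Hebbian update is simple to state in code. The sketch below assumes a global learning rate gamma and a single reward flag standing in for the neuromodulator signal; MothNet's actual update is gated per connection and includes decay:

```python
import numpy as np

def hebbian_update(W, f, gamma=0.1, reward=True):
    """Hebbian weight update dW_ij = gamma * f_i * f_j.

    The update only fires when the neuromodulator signals a reward,
    mirroring reward-gated plasticity in the insect AL-MB.
    """
    if not reward:
        return W
    return W + gamma * np.outer(f, f)  # co-active pairs strengthen

# Two co-active neurons strengthen their mutual connection.
f = np.array([1.0, 0.0, 1.0])  # hypothetical firing rates
W = np.zeros((3, 3))
W = hebbian_update(W, f, gamma=0.5)
print(W[0, 2])  # 0.5: neurons 0 and 2 fired together, so they wired together
```

Note that the silent neuron (index 1) gains no connections at all: "fire together, wire together" is multiplicative, so learning concentrates on the sparse set of neurons that a rewarded odor actually activates.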

In short, tighter biological constraints can yield novel, useful results. Such constraints also inspire new perspectives; the idea of massive training data—which NNs take for granted—is alien to the insect AL-MB. A bug requiring 60,000 training samples (routine for MNIST studies) would be dead. Striving for 99.9 percent accuracy is also unfamiliar to the insect AL-MB, since it has no need for such precision. Insect architectures instead excel at rapid, low-fidelity learning and outperform ML methods in this regime. Given the rich variety of biological NNs, many other learning architectures and mechanisms—including dynamic data processing and the energy/power constraints associated with neuronal wiring—remain unexplored in the context of ML.

C. elegans and the 300

Figure 2. Whole-brain imaging has produced novel datasets of the activity of a network with stereotyped connectivity. Figure courtesy of [8].
C. elegans has 302 neurons for performing all of its varied life-sustaining tasks, including chemotaxis, predator avoidance, and mating. These actions typically involve a sequence of primary behaviors — such as forward crawling, backward crawling, omega turns, and head sweeps. The worms tend to live in noisy stimulus environments and use a small number of sparsely-connected neurons to robustly navigate. The full connectome is characterized, as is almost every aspect of C. elegans anatomy. Despite this wealth of detailed knowledge, current state-of-the-art ML has difficulty capturing how the worms survive and compute neurosensory information. The C. elegans network may embody broader principles of network design that one could learn in order to engineer robust, functional behaviors with limited resources.

Much existing modeling work on C. elegans does not include the dynamical structure of the worm [2, 6, 9], whose connectome encodes a number of critical behaviors. Indeed, the connectomic structure itself seems ideally designed for the integration of proprioceptive feedback for efficient locomotion [9]. It is thus highly likely that C. elegans’ specific wiring diagram is engineered for its functional repertoire of behaviors. The emergence of whole-brain imaging in the worm will further revolutionize our ability to posit models constrained by known structure and dynamics (see Figures 2 and 3) [8, 12]. Such data can allow us to mathematically move toward plausible data-driven control models of the form

Figure 3. The neural data from Figure 2 live on a low-dimensional manifold with discrete states. Figure courtesy of [12].
\[\frac{d\boldsymbol{x}}{dt}= \boldsymbol{Ax}+\boldsymbol{Bu},\]

where one discovers the matrices \(\boldsymbol{A}\) and \(\boldsymbol{B}\) and the control signal \(\boldsymbol{u}\) directly from the data [11]. We can perform this regression in a supervised or unsupervised fashion, thereby framing C. elegans in a classic control context and providing a firm theoretical foundation for the characterization of neuronal control laws. C. elegans may well be the first model organism that researchers understand completely from a neurosensory integration perspective.
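The core of this system identification step is a least-squares regression on snapshot data. The sketch below works with the discrete-time analogue \(\boldsymbol{x}_{t+1} = \boldsymbol{A}\boldsymbol{x}_t + \boldsymbol{B}\boldsymbol{u}_t\) and assumes the control inputs are recorded alongside the states; the full dynamic mode decomposition with control (DMDc) algorithm of [11] adds truncated SVDs for robustness to noise and high dimension:

```python
import numpy as np

def fit_linear_control_model(X, U):
    """Estimate A, B in x_{t+1} ~= A x_t + B u_t from snapshot data.

    X: (n, m) state snapshots at times 0..m-1; U: (k, m-1) control inputs.
    Solves [A B] = X2 @ pinv([X1; U]) in the least-squares sense.
    """
    X1, X2 = X[:, :-1], X[:, 1:]
    Omega = np.vstack([X1, U])        # stacked state-input snapshots
    AB = X2 @ np.linalg.pinv(Omega)   # least-squares best-fit operator
    n = X.shape[0]
    return AB[:, :n], AB[:, n:]

# Recover a known (toy) discrete-time system from simulated data.
rng = np.random.default_rng(1)
A_true = np.array([[0.9, 0.1], [0.0, 0.8]])
B_true = np.array([[0.0], [1.0]])
m = 50
U = rng.normal(size=(1, m - 1))
X = np.zeros((2, m))
X[:, 0] = [1.0, -1.0]
for t in range(m - 1):
    X[:, t + 1] = A_true @ X[:, t] + (B_true @ U[:, t:t + 1]).ravel()
A_est, B_est = fit_linear_control_model(X, U)
print(np.allclose(A_est, A_true, atol=1e-6))  # True
```

On noiseless data generated by a true linear system, the regression recovers \(\boldsymbol{A}\) and \(\boldsymbol{B}\) exactly; with real neural recordings, the regularized DMDc variant and careful choice of the control signal \(\boldsymbol{u}\) become essential.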

Outlook: New Data, New Models

Our work has demonstrated the importance of incorporating organism-specific knowledge into modeling efforts to truly comprehend their design principles. We can only realize the remarkable promise of robust signal processing in noisy environments using small networks via the newly possible combination of in vivo datasets and physiological constraints. The emergence of both mathematical methods and innovative recordings allows for significant improvements in our understanding of neurosensory integration, network functionality, and robustness. We might therefore also expect imbuing NN architectures with biological constraints to yield significant improvements in ML and AI structures. This could lead to increased robustness, significant reductions in training data, and/or more energy- and memory-efficient network designs. Ultimately, the rich interplay between neuroscience and ML is set to pay high dividends in science and technology.

J. Nathan Kutz presented this research during an invited talk at the 2018 SIAM Conference on the Life Sciences, which took place last year in Minneapolis, Minn. 

[1] Bartunov, S., Santoro, A., Richards, B., Marris, L., Hinton, G.E., & Lillicrap, T. (2018). Assessing the scalability of biologically-motivated deep learning algorithms and architectures. Adv. Neur. Info. Process. Syst., 9390-9400.
[2] Boyle, J.H., Berri, S., & Cohen, N. (2012). Gait modulation in C. elegans: an integrated neuromechanical model. Front. Comp. Neurosci., 6(10).
[3] Dacks, A.M., Riffell, J.A., Martin, J.P., Gage, S.L., & Nighorn, A.J. (2012). Olfactory modulation by dopamine in the context of aversive learning. J. Neurophysiol., 108(2), 539-550.
[4] Delahunt, C.B., & Kutz, J.N. (2019). Putting a bug in ML: The moth olfactory network learns to read MNIST. Neur. Net. To be published.
[5] Delahunt, C.B., Riffell, J.A., & Kutz, J.N. (2018). Biological Mechanisms for Learning: A Computational Model of Olfactory Learning in the Manduca sexta Moth, with Applications to Neural Nets. Front. Comp. Neurosci., 12(102).
[6] Fieseler, C., Kunert-Graf, J., & Kutz, J.N. (2018). The control structure of the nematode Caenorhabditis elegans: Neuro-sensory integration and proprioceptive feedback. J. Biomech., 74, 1-8.
[7] Huerta, R., & Nowotny, T. (2009). Fast and Robust Learning by Reinforcement Signals: Explorations in the Insect Brain. Neur. Comp., 21(8), 2123-2151. 
[8] Kato, S., Kaplan, H.S., Schrödel, T., Skora, S., Lindsay, T.H., Yemini, E., …Zimmer, M. (2015). Global brain dynamics embed the motor command sequence of Caenorhabditis elegans. Cell, 163(3), 656-669.
[9] Kunert, J.M., Proctor, J.L., Brunton, S.L., & Kutz, J.N. (2017). Spatiotemporal feedback and network structure drive and encode Caenorhabditis elegans locomotion. PLoS Comp. Biol., 13(1), e1005303.
[10] Martin, J.P., Beyerlein, A., Dacks, A.M., Reisenman, C.E., Riffell, J.A., Lei, H., & Hildebrand, J.G. (2011). The neurobiology of insect olfaction: Sensory processing in a comparative context. Prog. Neurobiol., 95, 427-447.
[11] Proctor, J.L., Brunton, S.L., & Kutz, J.N. (2016). Dynamic mode decomposition with control. SIAM J. Appl. Dyn. Sys., 15(1), 142-161.
[12] Shipley, F.B., Clark, C.M., Alkema, M.J., & Leifer, A.M. (2014). Simultaneous optogenetic manipulation and calcium imaging in freely moving C. elegans. Front. Neuro. Circ., 8(28).
[13] Srinivasan, S., Greenspan, R.J., Stevens, C.F., & Grover, D. (2018). Deep(er) Learning. J. Neurosci., 38(34), 7365-7374.
[14] Wilson, R.I., Turner, G.C., & Laurent, G. (2004). Transformation of olfactory representations in the Drosophila antennal lobe. Science, 303(5656), 366-370.

Charles B. Delahunt is a postdoctoral fellow in the Department of Applied Mathematics at the University of Washington, where he develops bio-inspired neural network architectures and machine learning (ML) algorithms. He also applies ML methods to global health problems at Global Good in Bellevue, Wash. Charles Fieseler is a graduate student in the Department of Physics at the University of Washington, where he develops data-driven biophysical models of the interaction of neuronal networks and sensory processing with biomechanics. J. Nathan Kutz is a professor of applied mathematics at the University of Washington, where he works at the intersection of data analysis and dynamical systems.
