Discriminating neural ensemble patterns through dendritic computations in randomly connected feedforward networks

Curation statements for this article:
  • Curated by eLife

Abstract

Co-active or temporally ordered neural ensembles are a signature of salient sensory, motor, and cognitive events. Local convergence of such patterned activity as synaptic clusters on dendrites could help single neurons harness the potential of dendritic nonlinearities to decode neural activity patterns. We combined theory and simulations to assess the likelihood that projections from neural ensembles converge onto synaptic clusters even in networks with random connectivity. Using rat hippocampal and cortical network statistics, we show that clustered convergence of axons from 3-4 different co-active ensembles is likely even in randomly connected networks, leading to representation of arbitrary input combinations in at least ten target neurons in a population of 100,000. In the presence of larger ensembles, spatiotemporally ordered convergence of 3-5 axons from temporally ordered ensembles is also likely. These active clusters result in higher neuronal activation in the presence of strong dendritic nonlinearities and low background activity. We mathematically and computationally demonstrate a tight interplay between network connectivity, the spatiotemporal scales of subcellular electrical and chemical mechanisms, dendritic nonlinearities, and uncorrelated background activity. We suggest that dendritic cluster and sequence computation is pervasive, but that its expression as somatic selectivity requires a confluence of physiology, background activity, and connectomics.
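
As a rough illustration of the kind of estimate described above, the Monte Carlo sketch below asks how often a randomly connected postsynaptic neuron would receive at least one synapse from each of several co-active ensembles within a short stretch of dendrite. All parameter values (connection probability, ensemble size, dendritic length, cluster-zone width) and the helper function are illustrative assumptions for this sketch, not the data-constrained values or code used in the study, so the absolute numbers will differ from those reported in the abstract.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters (assumed for this sketch, not taken from the paper):
p = 0.02          # connection probability from any presynaptic neuron to the target neuron
E = 1000          # neurons per ensemble
M = 3             # number of co-active ensembles that must converge
L = 10_000.0      # total dendritic length (um), treated here as one linear cable
Z = 10.0          # width of the "cluster zone" (um), e.g. a chemical-signaling length scale
trials = 20_000   # number of simulated postsynaptic neurons

def has_clustered_convergence():
    """One trial neuron: True if some Z-um window contains >=1 synapse from every ensemble."""
    # Each ensemble neuron connects with probability p; synapses land uniformly on the dendrite.
    positions = [rng.uniform(0.0, L, rng.binomial(E, p)) for _ in range(M)]
    if any(len(pos) == 0 for pos in positions):
        return False
    all_pos = np.concatenate(positions)
    labels = np.concatenate([np.full(len(pos), i) for i, pos in enumerate(positions)])
    order = np.argsort(all_pos)
    all_pos, labels = all_pos[order], labels[order]
    # Any qualifying window can be anchored at its leftmost synapse.
    for start in all_pos:
        in_window = (all_pos >= start) & (all_pos <= start + Z)
        if len(set(labels[in_window])) == M:
            return True
    return False

p_cluster = np.mean([has_clustered_convergence() for _ in range(trials)])
print(f"P(full cluster on one neuron) ~ {p_cluster:.2e}")
print(f"Expected such neurons per 100,000 ~ {p_cluster * 100_000:.0f}")
```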

Article activity feed

  1. eLife Assessment

    This study presents useful quantitative insights into the prevalence of functionally clustered synaptic inputs on neuronal dendrites. The simple analytical calculations and computer simulations provide solid support for the main arguments. With improvements to the presentation and more realistic simulations (e.g. including the interaction between calcium and electric signals), the findings can lead to a more detailed understanding of how dendrites contribute to the computation of neuronal networks.

  2. Reviewer #1 (Public Review):

    In the current manuscript, the authors use theoretical and analytical tools to examine the possibility that neural projections engage ensembles of synaptic clusters in active dendrites. The analysis is divided into multiple models that differ in the connectivity parameters, speed of interactions, and identity of the signal (electric vs. second messenger). They first show that random connectivity almost ensures the representation of presynaptic ensembles. As expected, this convergence is much more likely for small group sizes and slow processes, such as calcium dynamics. Conversely, fast signals (spikes and postsynaptic potentials) and large groups are much less likely to recruit spatially clustered inputs. Dendritic nonlinearity in the postsynaptic cells was found to play a highly important role in distinguishing these clustered activation patterns, both when activated simultaneously and in sequence. The authors tackled the difficult issue of noise, showing a beneficial effect when noise 'happens' to fill in gaps in a sequential pattern but degraded performance at higher background activity levels. Lastly, the authors simulated selectivity to chemical and electrical signals. While they find that longer sequences are less perturbed by noise, under more realistic activation conditions the signals are not well resolved at the soma.

    While I think the premise of the manuscript is worth exploring, I have a number of reservations regarding the results.

    (1) In the analysis, the authors made the simplifying assumption that the chemical and electrical processes are independent. However, this is not the case; excitatory inputs to spines often trigger depolarization combined with pronounced calcium influx. This mixed signaling could have dramatic implications for the analysis, particularly if the dendrites are nonlinear (see below).

    (2) Sequence detection in active dendrites is often simplified to investigating activation in part or all of individual branches. However, the authors did not do that for most of their analysis. Instead, they treat the entire dendritic tree as one long branch and count how many inputs form clusters. I fail to see why this simplification is required and suspect it can lead to wrong results. For example, two inputs that map to different dendrites in the 'original' morphology, but happen to fall next to each other when the branches are staggered to form the long dendrite, would be counted as neighbors.
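
    To make that concern concrete, here is a hypothetical toy case (the synapse positions and helper functions are invented purely for illustration): two synapses sit on different 100 µm branches, but once the branches are laid end to end they appear only a few micrometres apart and get counted as a clustered pair.

    ```python
    # Toy illustration: a synapse at 98 um on branch A and one at 3 um on branch B are far
    # apart in the real morphology, but only 5 um apart once the two 100-um branches are
    # staggered end to end into one "long dendrite".
    branch_len = 100.0
    zone = 10.0                              # clustering window (um)
    synapses = [("A", 98.0), ("B", 3.0)]     # (branch id, position on that branch)

    def concatenated_positions(syns):
        offsets = {"A": 0.0, "B": branch_len}
        return sorted(offsets[b] + x for b, x in syns)

    def count_pairs_within_zone(positions, zone):
        positions = sorted(positions)
        return sum(1 for i in range(len(positions) - 1)
                   if positions[i + 1] - positions[i] <= zone)

    # On the concatenated dendrite the two synapses look like neighbours...
    print(count_pairs_within_zone(concatenated_positions(synapses), zone))   # -> 1

    # ...whereas restricting the window to single branches finds no cluster.
    per_branch = {}
    for b, x in synapses:
        per_branch.setdefault(b, []).append(x)
    print(sum(count_pairs_within_zone(pos, zone) for pos in per_branch.values()))  # -> 0
    ```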

    (3) The simulations were poorly executed. Figures 5 and 6 show examples but no summary statistics. The authors emphasize the importance of nonlinear dendritic interactions, but they do not include them in their analysis of the ectopic signals! I find it to be wholly expected that the effects of dendritic ensembles are not pronounced when the dendrites are linear.

    To provide a comprehensive analysis of dendritic integration, the authors could simulate more realistic synaptic conductances and voltage-gated channels. They would find much more complicated interactions between inputs on a single site, a sliding temporal and spatial window of nonlinear integration that depends on dendritic morphology, active and passive parameters, and synaptic properties. At different activation levels, the rules of synaptic integration shift to cooperativity between different dendrites and cellular compartments, further complicated by nonlinear interactions between somatic spikes and dendritic events.

    While it is tempting to extend back-of-the-napkin calculations of how many inputs can recruit nonlinear integration in active dendrites, the biological implementation is very different from this hypothetical. It is important to consider these questions, but I am not convinced that this manuscript adequately addressed the questions it set out to probe, nor does it provide information that was unknown beforehand.

  3. Reviewer #2 (Public Review):

    Summary:

    If synaptic input is functionally clustered on dendrites, nonlinear integration could increase the computational power of neural networks. But this requires the right synapses to be located in the right places. This paper aims to address the question of whether such synaptic arrangements could arise by chance (i.e. without special rules for axon guidance or structural plasticity), and could therefore be exploited even in randomly connected networks. This is important, particularly for the dendrites and biological computation communities, where there is a pressing need to integrate decades of work at the single-neuron level with contemporary ideas about network function.

    Using an abstract model in which ensembles of neurons project randomly to a postsynaptic population, the authors present back-of-envelope calculations that predict the probability of finding clustered synapses and spatiotemporal sequences. Using data-constrained parameters, they conclude that clustering and sequences are indeed likely to occur by chance (for large enough ensembles), but require strong dendritic nonlinearities and low background noise to be useful.
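
    As a sketch of what such a back-of-envelope estimate for sequences might look like (a generic toy version with assumed parameters, not the paper's actual derivation), one can divide the dendrite into slots of width delta and ask that each of M ensembles place at least one synapse in its own consecutive slot, in order:

    ```python
    # Illustrative, assumed parameters (not the paper's data-constrained values):
    E = 1000          # neurons per ensemble
    p = 0.05          # connection probability onto the target neuron
    L = 10_000.0      # total dendritic length (um)
    delta = 10.0      # width of one sequence "slot" on the dendrite (um)
    N_post = 100_000  # size of the postsynaptic population

    def expected_sequences(M):
        """Expected ordered M-slot sequences per neuron, under this toy model."""
        p_slot = 1.0 - (1.0 - p * delta / L) ** E    # P(an ensemble hits a given slot)
        n_starts = L / delta - (M - 1)               # possible starting slots
        return n_starts * p_slot ** M                # each ensemble must hit its own slot

    for M in (3, 4, 5):
        per_neuron = expected_sequences(M)
        print(f"M={M}: ~{per_neuron:.2e} sequences per neuron, "
              f"~{per_neuron * N_post:.0f} expected across {N_post} neurons")
    ```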

    Strengths:

    (1) The back-of-envelope reasoning presented can provide fast and valuable intuition. The authors have also made the effort to connect the model parameters with measured values. Even an approximate understanding of cluster probability can direct theory and experiments towards promising directions, or away from lost causes.

    (2) I found the general approach to be refreshingly transparent and objective. Assumptions are stated clearly about the model and statistics of different circuits. Along with some positive results, many of the computed cluster probabilities are vanishingly small, and noise is found to be quite detrimental in several cases. This is important to know, and I was happy to see the authors take a balanced look at conditions that help/hinder clustering, rather than just focusing on a particular regime that works.

    (3) This paper is also a timely reminder that synaptic clusters and sequences can exist on multiple spatial and temporal scales. The authors present results pertaining to the standard 'electrical' regime (~50-100 µm, <50 ms), as well as two modes of chemical signaling (~10 µm, 100-1000 ms). The senior author is indeed an authority on the latter, and the simulations in Figure 5, extending those from Bhalla (2017), are unique in this area. In my view, the role of chemical signaling in neural computation is understudied theoretically, but research will be increasingly important as experimental technologies continue to develop.

    Weaknesses:

    (1) The paper is mostly let down by the presentation. In the current form, some patience is needed to grasp the main questions and results, and it is hard to keep track of the many abbreviations and definitions. A paper like this can be impactful, but the writing needs to be crisp, and the logic of the derivation accessible to non-experts. See, for instance, Stepanyants, Hof & Chklovskii (2002) for a relevant example.

    It would be good to see a restructure that communicates the main points clearly and concisely, perhaps leaving other observations to an optional appendix. For the interested but time-pressed reader, I recommend starting with the last paragraph of the introduction, working through the main derivation on page 7, and writing out the full expression with key parameters exposed. Next, look at Table 1 and Figure 2J to see where different circuits and mechanisms fit in this scheme. Beyond this, the sequence derivation on page 15 and biophysical simulations in Figures 5 and 6 are also highlights.

    (2) I wonder if the authors are being overly conservative at times. The result highlighted in the abstract is that 10/100000 postsynaptic neurons are expected to exhibit synaptic clustering. This seems like a very small number, especially if circuits are to rely on such a mechanism. However, this figure assumes the convergence of 3-5 distinct ensembles. Convergence of inputs from just 2 ensembles would be much more prevalent, but still advantageous computationally. There has been excitement in the field about experiments showing the clustering of synapses encoding even a single feature.
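
    A quick toy calculation (parameters assumed for illustration only) makes this point about prevalence explicit: under a simple binomial model of random convergence, dropping from 3-4 co-active ensembles to 2 raises the expected number of neurons carrying a full cluster by several orders of magnitude.

    ```python
    # Toy scaling of expected full-cluster counts with the number of converging ensembles M.
    E, p, L, Z, N_post = 100, 0.05, 10_000.0, 10.0, 100_000   # illustrative values
    n_zones = L / Z
    p_zone = 1.0 - (1.0 - p * Z / L) ** E     # P(one ensemble hits a given Z-um zone)

    for M in (2, 3, 4):
        per_neuron = n_zones * p_zone ** M    # expected zones holding all M ensembles
        print(f"M={M}: ~{per_neuron * N_post:.2g} neurons with a full cluster (of {N_post})")
    ```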

    (3) The analysis supporting the claim that strong nonlinearities are needed for cluster/sequence detection is unconvincing. In the analysis, different synapse distributions on a single long dendrite are convolved with a sigmoid function and then the sum is taken to reflect the somatic response. In reality, dendritic nonlinearities influence the soma in a complex and dynamic manner. It may be that the abstract approach the authors use captures some of this, but it needs to be validated with simulations to be trusted (in line with previous work, e.g. Poirazi, Brannon & Mel (2003)).
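
    For readers unfamiliar with that abstraction, a minimal sketch of one possible reading of it follows (binned synapse counts passed through a sigmoidal branch nonlinearity and then summed as a proxy for the somatic response); this is my own illustrative construction with made-up parameters, not the authors' code, and it is exactly the kind of shortcut that calls for validation against detailed simulations.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    L, window = 1000.0, 10.0                         # dendrite length and integration window (um)
    edges = np.arange(0.0, L + window, window)       # bin edges along the "long dendrite"

    def soma_proxy(positions, threshold=3.0, gain=2.0):
        """Sum of per-window sigmoids of local synapse counts (a crude somatic stand-in)."""
        counts, _ = np.histogram(positions, bins=edges)
        branch_out = 1.0 / (1.0 + np.exp(-gain * (counts - threshold)))
        return branch_out.sum()

    clustered = rng.uniform(0.0, window, 8)          # 8 co-active synapses packed into one window
    dispersed = rng.uniform(0.0, L, 8)               # the same 8 synapses spread over the dendrite
    print(soma_proxy(clustered), soma_proxy(dispersed))   # clustered input yields the larger proxy
    ```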

    (4) It is unclear whether some of the conclusions would hold in the presence of learning. In the signal-to-noise analysis, all synaptic strengths are assumed equal. But if synapses involved in salient clusters or sequences were potentiated, presumably detection would become easier? Similarly, if presynaptic tuning and/or timing were reorganized through learning, the conditions for synaptic arrangements to be useful could be relaxed. Answering these questions is beyond the scope of the study, but there is a caveat there nonetheless.