Abstract
We consider the effects of correlations between the in- and out-degrees of individual neurons on the dynamics of a network of neurons. By using theta neurons, we can derive a set of coupled differential equations for the expected dynamics of neurons with the same in-degree. A Gaussian copula is used to introduce correlations between a neuron’s in- and out-degree, and numerical bifurcation analysis is used to determine the effects of these correlations on the network’s dynamics. For excitatory coupling, we find that inducing positive correlations has a similar effect to increasing the coupling strength between neurons, while for inhibitory coupling it has the opposite effect. We also determine the propensity of various two- and three-neuron motifs to occur as correlations are varied and give a plausible explanation for the observed changes in dynamics.
1 Introduction
Determining the effects of a network’s structure on its dynamics is an issue of great interest, particularly in the case of a network of neurons [15, 20, 26, 27]. Since neurons form directed synaptic connections, a neuron has both an in-degree—the number of neurons connected to it, and an out-degree—the number of neurons it connects to. In this paper, we present a framework for investigating the effects of correlations, both positive and negative, between these two quantities. To isolate the effects of these correlations, we assume no other structure in the networks, i.e. random connectivity based on the neurons’ degrees.
A number of other authors have considered this issue, and we now summarise relevant aspects of their results. LaMar and Smith [13] considered directed networks of identical pulse-coupled phase oscillators and mostly concentrated on the probability that the network would fully synchronise, and the time taken to do so. Vasquez et al. [30] considered binary neurons whose states were updated at discrete times, and found that negative degree correlations stabilised a low firing rate state, for excitatory coupling. A later paper [15] considered more realistic spiking neurons, had a mix of excitatory and inhibitory neurons, and concentrated more on the network’s response to transient stimuli, as well as analysis of network properties such as mean shortest path. Several authors have considered networks for which the in- and out-degrees of a neuron are equal, thereby inducing positive correlations between them [9, 27].
Vegué and Roxin [32] considered large networks of both excitatory and inhibitory leaky integrate-and-fire neurons and used a mean-field formalism to determine steady-state distributions of firing rates within neural populations. They considered the effects of within-neuron degree correlations for the excitatory-to-excitatory connections, and sometimes varied the probability of inhibitory-to-excitatory connections in order to create a “balanced state”. Nykamp et al. [20] also considered large networks of both excitatory and inhibitory neurons and used a Wilson–Cowan-type firing rate model to investigate the effects of within-neuron degree correlations. They showed that once correlations were included, the dynamics are effectively four-dimensional, in contrast to the two-dimensional dynamics expected from a standard rate-based excitatory/inhibitory network. They also related the degree distributions to cortical motifs. Experimental evidence for within-neuron degree correlations is given in [31].
The structure of the paper is as follows. In Sect. 2, we present the model network and summarise the analysis of [1] showing that under certain assumptions, the network can be described by a coupled set of ordinary differential equations, one for the dynamics associated with each distinct in-degree. In Sect. 3, we discuss how to generate correlated in- and out-degrees using a Gaussian copula. Our model involves sums over all distinct in-degrees, and in Sect. 4 we present a computationally efficient method for evaluating these sums, in analogy with Gaussian quadrature. Our main results are in Sect. 5, and we show in Sect. 6 that they also occur in networks of more realistic Morris–Lecar spiking neurons. We discuss motifs in Sect. 7 and conclude in Sect. 8.
2 Model
We consider the same model of pulse-coupled theta neurons as in [1]. The governing equations are
$$\begin{aligned} \frac{\hbox {d}\theta _i}{\hbox {d}t}=1-\cos {\theta _i}+(1+\cos {\theta _i})(\eta _i+I_i) \end{aligned}$$ (1)
$$\begin{aligned} I_i=\frac{K}{\langle k\rangle }\sum _{j=1}^{N}A_{ij}P_n(\theta _j) \end{aligned}$$ (2)
for \(i=1,2\ldots N\), where the phase angle \(\theta _i\) characterises the state of neuron i, which fires an action potential as \(\theta _i\) increases through \(\pi \),
K is the strength of connections within the network, \(A_{ij}=1\) if there is a connection from neuron j to neuron i and \(A_{ij}=0\) otherwise, \(\langle k\rangle \) is the average degree, \(\sum _{i,j}A_{ij}/N\), and \(P_n(\theta )=a_n(1-\cos {\theta })^n\) where \(a_n\) is chosen such that \(\int _0^{2\pi }P_n(\theta )\hbox {d}\theta =1\). The function \(P_n(\theta _j)\) models the pulse of current emitted by neuron j when it fires and can be made arbitrarily “spike-like” and localised around \(\theta _j=\pi \) by increasing n. The parameter \(\eta _i\) is the input current to neuron i in the absence of coupling, and the \(\eta _i\) are independently and randomly chosen from a Lorentzian distribution:
$$\begin{aligned} g(\eta )=\frac{\Delta /\pi }{(\eta -\eta _0)^2+\Delta ^2} \end{aligned}$$ (3)
where \(\eta _0\) is the centre of the distribution and \(\Delta \) its half-width at half-maximum.
Chandra et al. [1] considered the limit of large N and assumed that the network can be characterised by two functions. Firstly, a degree distribution \(P(\mathbf {k})\), normalised so that \(\sum _\mathbf {k}P(\mathbf {k})=N\), where \(\mathbf {k}=(k_\mathrm{in},k_\mathrm{out})\) and \(k_\mathrm{in}\) and \(k_\mathrm{out}\) are the in- and out-degrees, respectively, of a neuron with degree \(\mathbf {k}\). Secondly, an assortativity function \(a(\mathbf {k}' \rightarrow \mathbf {k})\) giving the probability of a connection from a neuron with degree \(\mathbf {k}'\) to one with degree \(\mathbf {k}\), given that such neurons exist. Whereas [1] investigated the effects of varying \(a(\mathbf {k}' \rightarrow \mathbf {k})\), here we consider the default value for this function [i.e. its value expected by chance, see (11)] and investigate the effects of varying correlations between \(k_\mathrm{in}\) and \(k_\mathrm{out}\) as specified by the degree distribution \(P(\mathbf {k})\). We emphasise that we are only considering within-neuron degree correlations and are not considering degree assortativity, which refers to the probability of neurons with specified degrees being connected to one another [1, 25].
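Before moving to this mean-field description, network (1) can be simulated directly. The following sketch is not from the original paper: it uses Euler integration, an Erdős–Rényi adjacency matrix as a stand-in for the structured networks considered below, and assumed values for N, K, \(\eta _0\) and \(\Delta \).

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed illustrative values; not taken from the paper
N, K = 200, 2.0
eta0, Delta = -0.2, 0.05

A = (rng.random((N, N)) < 0.1).astype(float)   # A[i, j] = 1: connection from j to i
np.fill_diagonal(A, 0.0)
k_mean = A.sum() / N                           # average degree <k>

a2 = 1.0 / (3.0 * np.pi)                       # normalises P_2 over [0, 2*pi]
def P2(theta):
    return a2 * (1.0 - np.cos(theta))**2

# eta_i drawn from the Lorentzian via inverse-CDF sampling
eta = eta0 + Delta * np.tan(np.pi * (rng.random(N) - 0.5))

theta = rng.uniform(0.0, 2.0 * np.pi, N)
dt = 0.01
for _ in range(2000):                          # Euler integration of the theta-neuron network
    I = (K / k_mean) * (A @ P2(theta))
    theta = theta + dt * (1.0 - np.cos(theta) + (1.0 + np.cos(theta)) * (eta + I))

r = abs(np.mean(np.exp(1j * theta)))           # degree of synchrony, 0 <= r <= 1
```

The quantity r is a crude synchrony measure for the finite network; the mean-field theory below replaces such simulations with deterministic equations for degree-dependent order parameters.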
In the limit \(N\rightarrow \infty \), the network can be described by a probability distribution \(f(\theta ,\eta |\mathbf {k},t)\), where \(f(\theta ,\eta |\mathbf {k},t)\hbox {d}\theta \ \hbox {d}\eta \) is the probability that a neuron with degree \(\mathbf {k}\) has phase angle in \([\theta ,\theta +\hbox {d}\theta ]\) and value of \(\eta \) in \([\eta ,\eta +\hbox {d}\eta ]\) at time t. This distribution satisfies the continuity equation
$$\begin{aligned} \frac{\partial f}{\partial t}+\frac{\partial (fv)}{\partial \theta }=0 \end{aligned}$$ (4)
where v is the continuum version of the right-hand side of (1):
$$\begin{aligned} v(\theta ,\eta |\mathbf {k},t)=1-\cos {\theta }+(1+\cos {\theta })\left[ \eta +\frac{K}{\langle k\rangle }\sum _{\mathbf {k}'}P(\mathbf {k}')a(\mathbf {k}'\rightarrow \mathbf {k})\int _0^{2\pi }\int _{-\infty }^{\infty }P_n(\theta ')f(\theta ',\eta '|\mathbf {k}',t)\,\hbox {d}\eta '\,\hbox {d}\theta '\right] \end{aligned}$$ (5)
System (4)–(5) is amenable to the use of the Ott/Antonsen ansatz [22, 23], and using standard techniques [3, 10, 12, 14] one can show that the long-time dynamics of the system is described by
$$\begin{aligned} \frac{\partial b(\mathbf {k},t)}{\partial t}=\frac{1}{2}\left[ -i(b(\mathbf {k},t)-1)^2+(b(\mathbf {k},t)+1)^2\left( -\Delta +i\eta _0+\frac{iK}{\langle k\rangle }\sum _{\mathbf {k}'}P(\mathbf {k}')a(\mathbf {k}'\rightarrow \mathbf {k})G(\mathbf {k}',t)\right) \right] \end{aligned}$$ (6)
where (having chosen \(n=2\))
The quantity
$$\begin{aligned} b(\mathbf {k},t)=\int _0^{2\pi }\int _{-\infty }^{\infty }f(\theta ,\eta |\mathbf {k},t)e^{i\theta }\,\hbox {d}\eta \,\hbox {d}\theta \end{aligned}$$ (8)
can be regarded as a complex-valued “order parameter” for neurons with degree \(\mathbf {k}\) at time t. The function \(G(\mathbf {k}',t)\) can be regarded as the output current from neurons with degree \(\mathbf {k}'\), and its form results from rewriting the pulse function \(P_n(\theta )\) in terms of \(b(\mathbf {k}',t)\). [For general n, \(G(\mathbf {k}',t)\) is the sum of a degree-n polynomial in \(b(\mathbf {k}',t)\) and one in \(\bar{b}(\mathbf {k}',t)\) (the conjugate of \(b(\mathbf {k}',t)\)) [10, 14]. One can take the limit \(n\rightarrow \infty \) and obtain \(G(\mathbf {k}',t)=(1-|b(\mathbf {k}',t)|^2)/(1+b(\mathbf {k}',t)+\bar{b}(\mathbf {k}',t)+|b(\mathbf {k}',t)|^2)\).] Note that the parameters of Lorentzian (3) appear in (6) as a result of evaluating the integral over \(\eta '\) in (5). Equation (6) only describes the long-time asymptotic behaviour of network (1), on the “Ott/Antonsen manifold”, and thus may not fully describe transients from arbitrary initial conditions, nor the effects of stimuli which move the network off this manifold.
One can also marginalise \(f(\theta ,\eta |\mathbf {k},t)\) over \(\eta \) to obtain the distribution of \(\theta \) for each \(\mathbf {k}\) and t:
$$\begin{aligned} p(\theta |\mathbf {k},t)=\frac{1}{2\pi }\cdot \frac{1-|b(\mathbf {k},t)|^2}{1-2|b(\mathbf {k},t)|\cos {\left( \theta -\arg {b(\mathbf {k},t)}\right) }+|b(\mathbf {k},t)|^2} \end{aligned}$$ (9)
a unimodal function with maximum at \(\theta =\arg (b(\mathbf{k},t))\). The firing rate of neurons with degree \(\mathbf {k}\) is equal to the flux through \(\theta =\pi \), i.e.
$$\begin{aligned} f(\mathbf {k},t)=2p(\pi |\mathbf {k},t)=\frac{1}{\pi }\cdot \frac{1-|b(\mathbf {k},t)|^2}{|1+b(\mathbf {k},t)|^2} \end{aligned}$$ (10)
where we have used the fact that \(\hbox {d}\theta /\hbox {d}t=2\) when \(\theta =\pi \).
Suppose our network has neutral assortativity, i.e. neurons are randomly connected with the probability of connection being determined by just their relevant degrees. Then [1, 25]
$$\begin{aligned} a(\mathbf {k}'\rightarrow \mathbf {k})=\frac{k_\mathrm{out}'k_\mathrm{in}}{N\langle k\rangle } \end{aligned}$$ (11)
and [writing \(P(k_\mathrm{in}',k_\mathrm{out}',\hat{\rho })\) instead of \(P(\mathbf {k}')\) from now on, where \(\hat{\rho }\) is a parameter used to calibrate the desired correlation between \(k_\mathrm{in}'\) and \(k_\mathrm{out}'\), defined below in (17)]
This quantity is proportional to the input to a neuron with degree \((k_\mathrm{in},k_\mathrm{out})\) from other neurons within the network, but it is clearly independent of \(k_\mathrm{out}\), so the state of a neuron with degree \((k_\mathrm{in},k_\mathrm{out})\) must also be independent of \(k_\mathrm{out}\), and thus G must be independent of \(k_\mathrm{out}'\). So the expression in (12) can be written
where
The function Q can be thought of as a \(k_\mathrm{in}'\)-dependent mean of \(k_\mathrm{out}'\) which is also dependent on the correlations between \(k_\mathrm{in}'\) and \(k_\mathrm{out}'\).
Our model equations are thus
where \(k_\mathrm{in}\) takes on integer values between the minimum and maximum in-degrees. The correlation between in- and out-degrees of a neuron is controlled by \(\hat{\rho }\), as explained below, and this appears as a parameter in (14).
It is interesting to compare (14)–(15) with the heuristic rate equation in [20]. These authors characterised a neuron by its “f–I curve”—a nonlinear function transforming input current into a firing rate. They concluded that the input current to a neuron is proportional to two quantities: (i) its in-degree, and (ii) the sum over in- and out-degrees of presynaptic neurons of the product of the joint degree distribution, the out-degree of the presynaptic neuron, and the “output” of presynaptic neurons. We also find this form of equation.
We note that the transformation \(V=\tan {(\theta /2)}\) maps a theta neuron to a quadratic integrate-and-fire (QIF) neuron with threshold and resets of \(\pm \infty \), and that for the special case \(n=\infty \) one could derive an equivalent pair of real equations rather than the single Eq. (15) where the two real variables are the mean voltage and firing rate of the QIF neurons with a specific in-degree [17].
3 Generating correlated in- and out-degrees
We now turn to the problem of deriving \(P(k_\mathrm{in}',k_\mathrm{out}',\hat{\rho })\) and thus \(Q(k_\mathrm{in}',\hat{\rho })\). For simplicity, we choose the distributions of both the in- and out-degrees to be the same, namely power law distributions with exponent \(-3\), truncated below and above at degrees a and b, respectively. (Evidence for power law distributions in the human brain is given in [4], for example.) So the probability distribution function of either in- or out-degree k is
$$\begin{aligned} p(k)=\frac{2a^2b^2}{b^2-a^2}\,k^{-3},\qquad a\le k\le b \end{aligned}$$ (16)
where the normalisation factor results from approximating the sum from a to b by an integral. (The approximation improves as a and b are both increased.) We want to introduce correlations between the in- and out-degrees of a neuron while retaining these marginal distributions. We do this using a Gaussian copula [18]. The correlated bivariate normal distribution with zero mean is
$$\begin{aligned} \phi (x,y)=\frac{1}{2\pi \sqrt{1-\hat{\rho }^2}}\exp {\left( -\frac{x^2-2\hat{\rho }xy+y^2}{2(1-\hat{\rho }^2)}\right) } \end{aligned}$$ (17)
where \(\hat{\rho }\in (-1,1)\) is the correlation between x and y. The variables x and y have no physical meaning, and we use the copula just as a way of deriving an analytic expression for \(P(k_\mathrm{in}',k_\mathrm{out}',\hat{\rho })\) for which the correlations between \(k_\mathrm{in}'\) and \(k_\mathrm{out}'\) can be varied systematically.
The marginal distributions for x and y are the same:
$$\begin{aligned} f(x)=\frac{1}{\sqrt{2\pi }}e^{-x^2/2} \end{aligned}$$
as are their cumulative distribution functions:
We define the cumulative distribution function of f:
$$\begin{aligned} C(x)=\int _{-\infty }^{x}f(x')\,\hbox {d}x'=\frac{1}{2}\left[ 1+\hbox {erf}\left( \frac{x}{\sqrt{2}}\right) \right] \end{aligned}$$
and also have the cumulative distribution function for a degree k:
$$\begin{aligned} C_k(k)=\int _a^{k}p(s)\,\hbox {d}s=\frac{b^2(k^2-a^2)}{k^2(b^2-a^2)} \end{aligned}$$
where we have treated k as a continuous variable and again approximated a sum by an integral.
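The cumulative distribution function \(C_k\) can be inverted in closed form, which gives a direct way to sample degrees from the truncated power law. A minimal sketch (with assumed values of a and b) and a check against \(\langle k\rangle =2ab/(a+b)\):

```python
import numpy as np

a, b = 100, 400                 # truncation degrees (assumed values)
rng = np.random.default_rng(1)

def inv_cdf(u):
    # Solving C_k(k) = b^2 (k^2 - a^2) / (k^2 (b^2 - a^2)) = u for k
    return a * b / np.sqrt(b**2 - u * (b**2 - a**2))

k = inv_cdf(rng.random(200_000))   # degrees with density ~ k^(-3) on [a, b]
print(k.mean(), 2 * a * b / (a + b))
```

The sample mean should be close to \(2ab/(a+b)=160\) for these values of a and b.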
We thus have the joint cumulative distribution function for \(k_\mathrm{in}\) and \(k_\mathrm{out}\)
The joint degree distribution for \(k_\mathrm{in}\) and \(k_\mathrm{out}\) is then
where the primes indicate differentiation with respect to the relevant k. Now
so
and
Substituting these into (24) and simplifying, we find
Note that for \(\hat{\rho }=0\), this simplifies to \(p(k_\mathrm{in})p(k_\mathrm{out})\), as expected. Examples of \(P(k_\mathrm{in},k_\mathrm{out},\hat{\rho })\) for different \(\hat{\rho }\) are shown in Fig. 1. Both Zhao et al. [33] and LaMar and Smith [13] used Gaussian copulas to create networks with correlated in- and out-degrees as done here, but did not derive an expression of form (29).
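The joint distribution of form (29) can be evaluated numerically as the product of the two power-law marginals and the Gaussian copula density. The following sketch (assumed values of a and b; \(\varPhi ^{-1}\) taken from the standard library) makes the structure explicit:

```python
import numpy as np
from statistics import NormalDist

a, b = 100, 400                          # truncation degrees (assumed values)
_nd = NormalDist()                       # standard normal, for the copula transform

def p(k):                                # truncated power law, continuous approximation
    return 2 * a**2 * b**2 / (b**2 - a**2) * k**-3.0

def C_k(k):                              # its cumulative distribution function
    return b**2 * (k**2 - a**2) / (k**2 * (b**2 - a**2))

def copula_density(u, v, rho):
    # Gaussian copula density phi(x, y) / (f(x) f(y)), with x = Phi^{-1}(u), y = Phi^{-1}(v)
    x, y = _nd.inv_cdf(u), _nd.inv_cdf(v)
    q = rho**2 * (x**2 + y**2) - 2.0 * rho * x * y
    return np.exp(-q / (2.0 * (1.0 - rho**2))) / np.sqrt(1.0 - rho**2)

def P_joint(kin, kout, rho):
    # Joint degree density; rho = 0 recovers the product p(kin) p(kout)
    return p(kin) * p(kout) * copula_density(C_k(kin), C_k(kout), rho)
```

Setting the copula parameter to zero makes the copula density identically 1, recovering the independent product, and the construction is symmetric under exchange of the two degrees.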
We need to relate \(\hat{\rho }\), a parameter in (29), to \(\rho \), the Pearson’s correlation coefficient between in- and out-degrees of a neuron (note: not between two connected neurons). We have
where \(\tilde{\varSigma }\) indicates a sum over all \(k_\mathrm{in}\) and \(k_\mathrm{out}\). \(\rho \) as a function of \(\hat{\rho }\) is shown in Fig. 2. We see that the relationship is monotonic, and while it is possible to obtain values of \(\rho \) close to 1, the lower limit is approximately \(-0.6\). By varying \(\hat{\rho }\) in (15), we can thus investigate the effects of varying the correlation coefficient between in- and out-degrees of a neuron (\(\rho \)) on the dynamics of a network. Note that for the distributions used here, treating k as a continuous variable, \(\langle k \rangle =2ab/(b+a)\).
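The map from \(\hat{\rho }\) to \(\rho \) can also be estimated by Monte Carlo: sample correlated standard normals, push each coordinate through \(\varPhi \) and \(C_k^{-1}\), and compute the Pearson correlation of the resulting degrees. A sketch with assumed a and b:

```python
import numpy as np
from math import erf, sqrt

a, b = 100, 400                          # truncation degrees (assumed values)
rng = np.random.default_rng(2)
Phi = np.vectorize(lambda z: 0.5 * (1.0 + erf(z / sqrt(2.0))))   # standard normal CDF

def inv_cdf(u):                          # inverse of the degree CDF C_k
    return a * b / np.sqrt(b**2 - u * (b**2 - a**2))

def rho_of_rhohat(rho_hat, n=50_000):
    # Correlated standard normals with correlation rho_hat (the Gaussian copula),
    # mapped to correlated degrees with the power-law marginals
    x = rng.standard_normal(n)
    y = rho_hat * x + np.sqrt(1.0 - rho_hat**2) * rng.standard_normal(n)
    kin, kout = inv_cdf(Phi(x)), inv_cdf(Phi(y))
    return np.corrcoef(kin, kout)[0, 1]
```

Because the degree transformation is nonlinear, \(\rho \) is attenuated relative to \(\hat{\rho }\), consistent with the asymmetric range seen in Fig. 2.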
Keeping in mind the normalisation \(\sum _\mathbf {k}P(\mathbf {k})=N\), we write \(Q(k_\mathrm{in}',\hat{\rho })\) as
Note that the factor of N here cancels with that in the last term in (15), giving equations which do not explicitly depend on N. Examples of \(Q(k_\mathrm{in}',\hat{\rho })\) for different \(\hat{\rho }\) are shown in Fig. 3. We see that increasing \(\hat{\rho }\) gives more weight to high in-degree nodes and less to low in-degree nodes and vice versa.
4 Reduced model
We now turn to the issue of evaluating the sums over degrees in both (31) and (15). Although such sums are typically over only several hundred terms, it is possible to accurately evaluate them using many fewer terms, in analogy with Gaussian quadrature [5].
Defining an inner product as the sum
$$\begin{aligned} \langle f,g\rangle =\sum _{k=a}^{b}f(k)g(k) \end{aligned}$$
we assume that there is a corresponding set of orthogonal polynomials \(\{q_n(k)\}_{0\le n}\) associated with this product. These polynomials satisfy the three-term recurrence relationship
$$\begin{aligned} q_{n+1}(k)=(k-\alpha _n)q_n(k)-\beta _nq_{n-1}(k) \end{aligned}$$
where
$$\begin{aligned} \alpha _n=\frac{\langle kq_n,q_n\rangle }{\langle q_n,q_n\rangle },\qquad \beta _n=\frac{\langle q_n,q_n\rangle }{\langle q_{n-1},q_{n-1}\rangle } \end{aligned}$$
\(q_0(k)=1\) and \(q_{-1}(k)=0\). Then for a given positive integer n, assuming that f is 2n times continuously differentiable, we have the Gaussian summation formula
with error
where \(x_i\) are the n roots of \(q_n\), \(\xi \in [a,b]\), and the weights \(w_i\) are discussed below. Note that the roots of \(q_n(k)\) are typically not integers, but this does not matter if the function f(k) can be evaluated for arbitrary k.
In practice, to find the roots of \(q_n\) we use the Golub–Welsch algorithm. Form the tridiagonal matrix
$$\begin{aligned} J=\begin{pmatrix} \alpha _0 &{} \sqrt{\beta _1} &{} &{} \\ \sqrt{\beta _1} &{} \alpha _1 &{} \ddots &{} \\ &{} \ddots &{} \ddots &{} \sqrt{\beta _{n-1}} \\ &{} &{} \sqrt{\beta _{n-1}} &{} \alpha _{n-1} \end{pmatrix} \end{aligned}$$
The eigenvalues of J are the \(\{x_i\}\) and if all eigenvectors, \(v_i\), are scaled to have norm 1, then \(w_i=(b-a)\left( v_i^{(1)}\right) ^2\), where \(v_i^{(1)}\) is the first component of \(v_i\).
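This construction can be sketched in a few lines of Python (unit weights on the integers \(a\le k\le b\), with assumed \(a=100\), \(b=400\)): the recurrence coefficients come from the Stieltjes procedure, and nodes and weights from the eigendecomposition of J. Here we normalise the weights by the zeroth moment of the discrete measure, \(b-a+1\), which plays the role of the factor \(b-a\) in the continuum approximation.

```python
import numpy as np

a, b = 100, 400
ks = np.arange(a, b + 1, dtype=float)    # support of the discrete measure, unit weights

def recurrence_coeffs(n):
    # Stieltjes procedure: alpha_j, beta_j for monic polynomials orthogonal
    # with respect to <f, g> = sum_{k=a}^{b} f(k) g(k)
    alpha, beta = np.zeros(n), np.zeros(n)
    q_prev, q, nrm_prev = np.zeros_like(ks), np.ones_like(ks), 1.0
    for j in range(n):
        nrm = np.dot(q, q)
        alpha[j] = np.dot(ks * q, q) / nrm
        beta[j] = ks.size if j == 0 else nrm / nrm_prev   # beta_0 = mu_0 by convention
        q, q_prev = (ks - alpha[j]) * q - beta[j] * q_prev, q
        nrm_prev = nrm
    return alpha, beta

def golub_welsch(n):
    alpha, beta = recurrence_coeffs(n)
    J = np.diag(alpha) + np.diag(np.sqrt(beta[1:]), 1) + np.diag(np.sqrt(beta[1:]), -1)
    nodes, V = np.linalg.eigh(J)              # eigenvalues of J are the nodes x_i
    return nodes, beta[0] * V[0, :]**2        # weights w_i = mu_0 * (first component)^2

nodes, weights = golub_welsch(15)
g = lambda k: np.exp(-((k - 250.0) / 80.0)**2)    # a smooth test function
print(np.sum(weights * g(nodes)), np.sum(g(ks)))  # 15 terms vs 301 terms
```

For smooth summands, the 15-term approximation agrees with the full 301-term sum to many digits, which is the accuracy behaviour exploited in Fig. 4.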
We will use the approximation
where \(n\ll b-a+1\), the number of terms in the original sum. Given the resemblance of the sum on the left in (39) to the integral of f(k) between \(k=a\) and \(k=b\), it is not surprising that the roots of \(q_n\), when translated from the interval [a, b] to \([-\,1,1]\), are close to the roots of the nth-order Legendre polynomial, as would be used in Gaussian quadrature. (The same is true for the corresponding weights.)
We thus choose n and write
where \(k_j\) are the roots and \(w_j\) are the weights, respectively, associated with \(q_n(k)\). In order to use the same approximation for the sum in (15), we consider only values of \(k_\mathrm{in}\) equal to the \(k_j\). As mentioned, these are typically not integers. We refer to them as “virtual degrees”. Thus, our model equations are
for \(j=1,\dots n\). We are interested in fixed points of these equations, and how these fixed points and their stabilities change as parameters such as \(\eta _0\) and \(\hat{\rho }\) are varied. We use pseudo-arclength continuation [7, 11] to investigate this.
In order to calculate the mean frequency of the network, we use the result that the frequency for neurons with in-degree k is [17]
$$\begin{aligned} f(k)=\frac{1}{\pi }\hbox {Re}\left( \frac{1-\overline{b(k)}}{1+\overline{b(k)}}\right) \end{aligned}$$
where overline indicates complex conjugate, and then average over the network to obtain the mean frequency
(The normalisation is needed because even though the integral of the joint degree distribution over \([k_\mathrm{in},k_\mathrm{out}]^2\) equals 1, the sum over the corresponding discrete grid does not.)
Typical convergence of a calculation of f with increasing n is shown in Fig. 4 for several sets of parameter values. We see rapid convergence and choose \(n=15\) for future calculations. (Calculations of the form shown in Figs. 5 and 7 were repeated using the full degree sequence from a to b, with essentially identical results.)
5 Results
5.1 Excitatory coupling
We first consider the case of excitatory coupling, i.e. \(K>0\). We expect a region of bistability for negative \(\eta _0\), as seen in Fig. 5. We see that decreasing \(\rho \) moves the curve to the right and vice versa. (\(\hat{\rho }\) was chosen to give these particular values of \(\rho \).) Following the saddle-node bifurcations as \(\rho \) is varied, we obtain Fig. 6.
Given the influence of \(\hat{\rho }\) (and thus \(\rho \)) on Q (see Fig. 3), this result is easy to understand. Neurons with high in-degree fire faster than those with low in-degree, and for positive \(\rho \), high in-degree neurons contribute more to the sum in (41) than for negative \(\rho \). Thus, the total amount of “output” from neurons is higher for positive \(\rho \) and lower for negative \(\rho \). Put another way, with positive \(\rho \), neurons with high firing rate (due to high in-degree) are more likely to have a high out-degree, thus exciting more neurons than would otherwise be the case. Increasing \(\rho \) has the same qualitative effect as increasing the coupling strength K, as observed by [20].
5.2 Inhibitory coupling
Next, we consider inhibitory coupling, with \(K=-\,1\). Average network frequency versus \(\eta _0\) is shown in Fig. 7 for three different values of \(\rho \). We see that increasing \(\rho \) slightly increases the frequency and vice versa. We can also understand this behaviour in a qualitative sense. For inhibitory coupling, neurons with high in-degree are not likely to be firing, so can be ignored. When \(\rho <0\), neurons with low in-degree will have high out-degree, and thus the amount of inhibitory “output” in the network is increased. For positive \(\rho \), neurons with low in-degree will have low out-degree, and thus they will inhibit fewer neurons than in the case of negative \(\rho \), leading to a higher average firing rate.
We performed calculations corresponding to the results shown in Figs. 5 and 7 for networks of theta neurons and found qualitatively, and to a large extent quantitatively, the same behaviour as in those figures (results not shown).
6 More realistic network
To verify the behaviour seen above in a network of theta neurons, we investigated a more realistic network of spiking neurons, in this case Morris–Lecar neurons. For the case of excitatory coupling, the network equations are [29]
where
Parameters are \(V_1=-\,1.2,V_2=18,V_3=12,V_4=17.4,\lambda _0=1/15 \hbox {ms}^{-1},g_L=2, g_K=8, g_\mathrm{Ca}=4, V_L=-\,60, V_\mathrm{Ca}=120, V_K=-\,80, C=20\,\upmu \,\hbox {F}/\hbox {cm}^2,\tau =100,V_\mathrm{ex}=120,\epsilon =5\,\hbox {mS}/\hbox {cm}^2\). Voltages are in mV, conductances are in mS/cm\(^2\), time is measured in milliseconds, and currents in \(\upmu \hbox {A}/\hbox {cm}^2\). In the absence of coupling and heterogeneity, a neuron undergoes a SNIC bifurcation as \(I_0\) is increased through \(\sim 40\). We have used synaptic coupling of the form in [6], but on a timescale \(\tau \) rather than instantaneous as in that paper. The \(I_i\) are randomly chosen from a Lorentzian distribution with mean zero and half-width at half-maximum 0.05.
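A single uncoupled neuron of this type can be simulated with the parameter values listed above. The sketch below assumes the standard Morris–Lecar forms for \(m_\infty \), \(w_\infty \) and the rate function, since the displayed equations (44)–(46) specify the coupled network; it checks that the neuron is quiescent below the SNIC bifurcation near \(I_0\approx 40\) and spikes above it.

```python
import numpy as np

# Parameter values from the text; the functional forms are the usual
# Morris-Lecar ones and are an assumption here
V1, V2, V3, V4 = -1.2, 18.0, 12.0, 17.4
gL, gK, gCa = 2.0, 8.0, 4.0
VL, VCa, VK = -60.0, 120.0, -80.0
C, lam0 = 20.0, 1.0 / 15.0

m_inf = lambda V: 0.5 * (1.0 + np.tanh((V - V1) / V2))
w_inf = lambda V: 0.5 * (1.0 + np.tanh((V - V3) / V4))
lam   = lambda V: lam0 * np.cosh((V - V3) / (2.0 * V4))

def simulate(I0, T=2000.0, dt=0.05):
    # Euler integration of a single uncoupled Morris-Lecar neuron
    V, w = -60.0, 0.0
    Vs = np.empty(int(T / dt))
    for i in range(Vs.size):
        dV = (I0 - gL * (V - VL) - gK * w * (V - VK) - gCa * m_inf(V) * (V - VCa)) / C
        dw = lam(V) * (w_inf(V) - w)
        V, w = V + dt * dV, w + dt * dw
        Vs[i] = V
    return Vs

V_rest = simulate(20.0)   # below the SNIC: settles to rest
V_spk  = simulate(60.0)   # above the SNIC: repetitive spiking
```

With this drive, the subthreshold trace settles well below the spike peaks of the suprathreshold trace, consistent with the class-I excitability described in the text.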
The network is created as follows, using the Gaussian copula of Sect. 3. For each \(i\in \{1,\dots N\}\), let \(x_1\) and \(x_2\) be independently chosen from a unit normal distribution. Then \(x_1\) and \(y_1=\hat{\rho } x_1+\sqrt{1-\hat{\rho }^2}x_2\) both have unit normal distributions and covariance \(\hat{\rho }\), i.e. are realisations of x and y in (17). We then set \(k_\mathrm{in}^i=C_k^{-1}(C(x_1))\) and \(k_\mathrm{out}^i=C_k^{-1}(C(y_1))\). These degrees each have distribution p(k) but have correlation coefficient \(\rho \), where \(\rho \) is determined by the value of \(\hat{\rho }\), as shown in Fig. 2. We then create the connection from neuron j to neuron i (i.e. set \(A_{ij}=1\)) with probability
where \(\langle k\rangle \) is the mean of the degrees, and \(A_{ij}=0\) otherwise (the Chung–Lu model [2]). Typical results for the network generation are shown in Fig. 8, and the measured correlations are given in the figure. The distributions of the resulting degrees no longer match the distributions of the \(k_\mathrm{in}^i\) and \(k_\mathrm{out}^i\), but are close. We could have used the configuration model to avoid this problem [19], but here we are only interested in qualitative results. Quasi-statically sweeping through \(I_0\) for networks with three different values of \(\rho \), we obtain Fig. 9, in qualitative agreement with Fig. 5. In Fig. 5, there is a region of bistability for each value of \(\rho \), and the region moves to lower average drive as \(\rho \) is increased. Since we cannot detect unstable states through simulation of (44)–(46), this bistability is manifested as jumps from low-frequency to high-frequency branches as \(I_0\) is varied, as seen in Fig. 9.
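The degree-generation and wiring steps just described can be sketched compactly; the network size and degree bounds below are illustrative choices, not those of the paper, and the target degrees are left as real numbers rather than rounded to integers.

```python
import numpy as np
from math import erf, sqrt

rng = np.random.default_rng(3)
N, a, b = 2000, 20, 80      # network size and degree bounds (illustrative values)
Phi = np.vectorize(lambda z: 0.5 * (1.0 + erf(z / sqrt(2.0))))   # standard normal CDF

def inv_cdf(u):             # C_k^{-1} for the truncated power law
    return a * b / np.sqrt(b**2 - u * (b**2 - a**2))

def make_network(rho_hat):
    # Correlated normals (the Gaussian copula), mapped to target degrees
    x1 = rng.standard_normal(N)
    y1 = rho_hat * x1 + np.sqrt(1.0 - rho_hat**2) * rng.standard_normal(N)
    kin, kout = inv_cdf(Phi(x1)), inv_cdf(Phi(y1))
    # Chung-Lu wiring: P(A_ij = 1) = kin_i * kout_j / (N * <k>)
    P = np.clip(np.outer(kin, kout) / (N * kin.mean()), 0.0, 1.0)
    A = (rng.random((N, N)) < P).astype(int)
    np.fill_diagonal(A, 0)
    return A, kin, kout

A, kin, kout = make_network(0.7)
```

As noted in the text, the realised degrees only approximate the targets: each realised in-degree is a sum of independent Bernoulli variables with mean close to the corresponding target.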
For inhibitory coupling, we replace \(m_\infty (V_i)\) in (46) by \(w_\infty (V_i)\), replace \(V_\mathrm{ex}-V_i\) in (44) by \(V_K-V_i\), and choose \(\epsilon =10\,\hbox {mS}/\hbox {cm}^2\). Sweeping through \(I_0\) for three different values of \(\rho \), we obtain Fig. 10, in qualitative agreement with Fig. 7.
7 Motifs
A number of authors have found that “motifs” (small sets of neurons connected in a specific way) do not occur in cortical networks in the proportions one would expect by chance [24, 28]. Some theoretical results relating the presence or absence of certain motifs to network dynamics have been obtained [8, 21, 33]. For networks generated as described in Sect. 6, we counted the number of order-2 and order-3 motifs (involving two or three neurons, respectively) for negative, zero and positive values of \(\rho \). We compute the frequencies of order-2 motifs by counting the numbers of 0’s, 1’s and 2’s in the upper triangular part of \(A+A^T\), where A is the adjacency matrix and \(^T\) denotes transpose. These correspond to unconnected, unidirectionally connected and reciprocally connected pairs of neurons, respectively. For all 13 connected order-3 motifs, we used the software “acc-motif” [16]. The remaining three unconnected motifs were counted by our own algorithm: looping through all neurons, we create for each a list of disconnected neurons and count order-2 motifs among those. The results are shown in Figs. 11 and 12, where counts are shown relative to the numbers found for \(\rho =0\).
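The order-2 count described above reads off directly from the adjacency matrix; a minimal sketch on a three-neuron example:

```python
import numpy as np

def order2_motif_counts(A):
    # Entries of the upper triangle of A + A^T: 0 = unconnected pair,
    # 1 = unidirectional connection, 2 = reciprocal connection
    S = A + A.T
    vals = S[np.triu_indices(A.shape[0], k=1)]
    return {m: int(np.sum(vals == m)) for m in (0, 1, 2)}

# Three neurons: 0 and 1 reciprocally connected, one unidirectional
# connection from 1 to 2, and the pair (0, 2) unconnected
A = np.array([[0, 1, 0],
              [1, 0, 0],
              [0, 1, 0]])
counts = order2_motif_counts(A)
```

The three counts always sum to the number of neuron pairs, \(N(N-1)/2\).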
In all motifs with at least one reciprocal connection between two neurons, we see that the number of motifs goes up with positive \(\rho \) and down with negative \(\rho \). This can be understood in an intuitive way: suppose \(0<\rho \) and consider a neuron with a high out-degree. It is likely to connect to a neuron with a high in-degree. But this second neuron will also have a high out-degree and is therefore more likely to connect to the first neuron, which also has a high in-degree, forming a reciprocal connection. Similarly, suppose \(\rho <0\) and consider a neuron with high out-degree. It is likely to connect to a neuron with high in-degree but low out-degree. Thus, it is unlikely that this second neuron will connect back to the first, which has a low in-degree.
8 Conclusion
We have investigated the effects of correlating the in- and out-degrees of spiking neurons in a structured network. We considered a large network of theta neurons, allowing us to exploit the analytical results previously derived by [1], which give dynamics for complex-valued order parameters, indexed by neurons with the same degrees. The states of interest are steady states of these dynamics, and by using a Gaussian copula we were able to analytically incorporate a parameter which controls the correlations between in- and out-degrees. Numerical continuation was then used to determine the effects of varying parameters, particularly the degree correlation. In order to reduce the computational cost, we introduced the concept of “virtual degrees” allowing us to efficiently approximate sums with many terms by sums with fewer terms.
For an excitatory network, we found that increasing degree correlations had a similar effect as increasing the overall strength of coupling between neurons, consistent with the findings of [20, 32]. Our results are also consistent with those of [30], who found that negative correlations stabilised the low firing rate state, as shown in Fig. 5. For inhibitory coupling, we found that increasing degree correlations slightly increased the mean firing rate of the network. Both of these effects were reproduced in a more realistic network of Morris–Lecar spiking neurons.
We also measured the relative frequency of occurrence of order-2 and order-3 motifs as within-degree correlations were varied and found that in all motifs with at least one reciprocal connection between two neurons, the number of motifs is positively correlated with \(\rho \). Several authors have linked motif statistics to synchrony within a network [8, 33]; however, a link between motif statistics and firing rate, as observed here, seems yet to be developed.
We chose a Lorentzian distribution of the \(\eta _i\) in (1), as many others have done [22], in order to analytically evaluate an integral and derive (6). However, we repeated the calculations shown in Figs. 5, 7, 9 and 10 using a Gaussian distribution of the \(\eta _i\) and found the same qualitative behaviour (not shown). Regarding the parameter n governing the sharpness of the function \(P_n(\theta )\), we repeated the calculations shown in Figs. 5 and 7 for \(n=5,\infty \) and obtained qualitatively the same results (not shown). We used a Gaussian copula to correlate in- and out-degrees due to its analytical form, but numerically investigated the scenarios shown in Figs. 5 and 7 for t copulas and Archimedean Clayton, Frank and Gumbel copulas and found the same qualitative behaviour (also not shown).
For simplicity, we used the same truncated power law distribution for both in- and out-degrees. However, the use of a Gaussian copula for inducing correlations between degrees does not require them to be the same, so one could use the framework presented here to investigate the effects of varying degree distributions [26], correlated or not.
We also only considered either excitatory or inhibitory networks, but it would be straightforward to generalise the techniques used here to the case of both types of neuron, with within-neuron degree correlations for either or both populations, though at the expense of increasing the number of parameters to investigate.
References
Chandra S, Hathcock D, Crain K, Antonsen TM, Girvan M, Ott E (2017) Modeling the network dynamics of pulse-coupled neurons. Chaos 27(3):033102. https://doi.org/10.1063/1.4977514
Chung F, Lu L (2002) Connected components in random graphs with given expected degree sequences. Ann Comb 6(2):125–145
Coombes S, Byrne Á (2019) Next generation neural mass models. In: Nonlinear dynamics in computational neuroscience. Springer, pp 1–16
Eguíluz VM, Chialvo DR, Cecchi GA, Baliki M, Apkarian AV (2005) Scale-free brain functional networks. Phys Rev Lett 94:018102. https://doi.org/10.1103/PhysRevLett.94.018102
Engblom S (2006) Gaussian quadratures with respect to discrete measures. Technical Report, Uppsala University, Technical Report 2006-007
Ermentrout G, Kopell N (1990) Oscillator death in systems of coupled neural oscillators. SIAM J Appl Math 50(1):125–146
Govaerts WJ (2000) Numerical methods for bifurcations of dynamical equilibria, vol 66. SIAM, Philadelphia
Hu Y, Trousdale J, Josić K, Shea-Brown E (2013) Motif statistics and spike correlations in neuronal networks. J Stat Mech Theory Exp 2013(03):P03012
Kähne M, Sokolov I, Rüdiger S (2017) Population equations for degree-heterogenous neural networks. Phys Rev E 96(5):052306
Laing CR (2014) Derivation of a neural field model from a network of theta neurons. Phys Rev E 90(1):010901
Laing CR (2014) Numerical bifurcation theory for high-dimensional neural models. J Math Neurosci 4(1):1
Laing CR (2016) Bumps in small-world networks. Front Comput Neurosci 10:53
LaMar MD, Smith GD (2010) Effect of node-degree correlation on synchronization of identical pulse-coupled oscillators. Phys Rev E 81(4):046206
Luke TB, Barreto E, So P (2013) Complete classification of the macroscopic behavior of a heterogeneous network of theta neurons. Neural Comput 25:3207–3234
Martens MB, Houweling AR, Tiesinga PH (2017) Anti-correlations in the degree distribution increase stimulus detection performance in noisy spiking neural networks. J Comput Neurosci 42(1):87–106
Meira LAA, Máximo VR, Fazenda AL, Da Conceição AF (2014) Acc-motif: accelerated network motif detection. IEEE/ACM Trans Comput Biol Bioinform 11(5):853–862. https://doi.org/10.1109/TCBB.2014.2321150
Montbrió E, Pazó D, Roxin A (2015) Macroscopic description for networks of spiking neurons. Phys Rev X 5(2):021028
Nelsen RB (2007) An introduction to copulas. Springer, New York
Newman M (2003) The structure and function of complex networks. SIAM Rev 45(2):167–256
Nykamp DQ, Friedman D, Shaker S, Shinn M, Vella M, Compte A, Roxin A (2017) Mean-field equations for neuronal networks with arbitrary degree distributions. Phys Rev E 95(4):042323
Ocker GK, Litwin-Kumar A, Doiron B (2015) Self-organization of microcircuits in networks of spiking neurons with plastic synapses. PLoS Comput Biol 11(8):e1004458
Ott E, Antonsen T (2008) Low dimensional behavior of large systems of globally coupled oscillators. Chaos 18:037113
Ott E, Antonsen T (2009) Long time evolution of phase oscillator systems. Chaos 19:023117
Perin R, Berger TK, Markram H (2011) A synaptic organizing principle for cortical neuronal groups. Proc Natl Acad Sci 108(13):5419–5424. https://doi.org/10.1073/pnas.1016051108
Restrepo JG, Ott E (2014) Mean-field theory of assortative networks of phase oscillators. Europhys Lett 107(6):60006
Roxin A (2011) The role of degree distribution in shaping the dynamics in networks of sparsely connected spiking neurons. Front Comput Neurosci 5:8
Schmeltzer C, Kihara AH, Sokolov IM, Rüdiger S (2015) Degree correlations optimize neuronal network sensitivity to sub-threshold stimuli. PloS One 10:e0121794
Song S, Sjöström PJ, Reigl M, Nelson S, Chklovskii DB (2005) Highly nonrandom features of synaptic connectivity in local cortical circuits. PLoS Biol 3(3):e68
Tsumoto K, Kitajima H, Yoshinaga T, Aihara K, Kawakami H (2006) Bifurcations in Morris–Lecar neuron model. Neurocomputing 69(4–6):293–316
Vasquez J, Houweling A, Tiesinga P (2013) Simultaneous stability and sensitivity in model cortical networks is achieved through anti-correlations between the in- and out-degree of connectivity. Front Comput Neurosci 7:156
Vegué M, Perin R, Roxin A (2017) On the structure of cortical microcircuits inferred from small sample sizes. J Neurosci 37(35):8498–8510. https://doi.org/10.1523/JNEUROSCI.0984-17.2017
Vegué M, Roxin A (2019) Firing rate distributions in spiking networks with heterogeneous connectivity. Phys Rev E 100:022208. https://doi.org/10.1103/PhysRevE.100.022208
Zhao L, Beverlin BI, Netoff T, Nykamp DQ (2011) Synchronization from second order network connectivity statistics. Front Comput Neurosci 5:28
Acknowledgements
This work is partially supported by the Marsden Fund Council from Government funding, managed by Royal Society Te Apārangi. We thank Andrew Punnett and Marti Anderson for useful conversations about copulas and Shawn Means for comments on the manuscript. We also thank the referees for their helpful comments which improved the paper.
Ethics declarations
Conflict of interest
The authors declare that they have no conflict of interest.
Communicated by Benjamin Lindner.
Laing, C.R., Bläsche, C. The effects of within-neuron degree correlations in networks of spiking neurons. Biol Cybern 114, 337–347 (2020). https://doi.org/10.1007/s00422-020-00822-0