1 Introduction

Neurons transmit and process information by sequences of action potentials (spikes) (Bear et al. 2007; Dayan and Abbott 2001). Spike generation is often well described in a stochastic framework because neural firing is affected by intrinsic noise sources, for instance channel noise (White et al. 2000), and by the massive synaptic input from many other (only weakly correlated) neurons (Destexhe et al. 2003). An important task of computational neuroscience is to identify the way in which information about stimuli is encoded in the neural spike train (Rieke et al. 1996). A nontrivial step on this way is to understand the firing statistics first, i.e. to study which spike patterns are possible in the absence or presence of a sensory stimulus and which patterns can be expected due to the aforementioned intrinsic noise or the synaptic background.

One key feature of the firing statistics is the correlations among the interspike intervals (ISIs) of the spike train. Experimental evidence shows that ISIs are typically correlated over a few lags (Lowen and Teich 1992; Nawrot et al. 2007; Neiman and Russell 2001; Ratnam and Nelson 2000; Engel et al. 2008), and these correlations can have a strong effect on neural information transmission (Ratnam and Nelson 2000; Chacron et al. 2001; Chacron et al. 2004; Avila-Akerberg and Chacron 2011). A number of mechanisms are known to induce ISI correlations, such as correlated stimuli (so-called colored noise) (Middleton et al. 2003; Lindner 2004) or intrinsic noise from ion channels with slow kinetics (Fisch et al. 2012). Another prevalent mechanism for ISI correlations is the slow feedback processes mediating spike-frequency adaptation (Treves 1993; Chacron et al. 2000; Liu and Wang 2001; Benda et al. 2005), a phenomenon describing the reduced neuronal response to slowly changing stimuli (Benda and Herz 2003; Gabbiani and Krapp 2006). More specifically, the negative feedback from both subthreshold and spike-triggered adaptation impacts the timing of neuronal spiking (Benda and Herz 2003; Benda et al. 2010; Gabbiani and Krapp 2006; Ladenbauer et al. 2012; Schwalger and Lindner 2013). The correlations of adapting neurons can show qualitatively different patterns, ranging from monotonically decaying to damped oscillatory correlations (Chacron et al. 2004; Chacron et al. 2001; Ratnam and Nelson 2000; Schwalger et al. 2010). Two of us have recently put forward a simple theory for the ISI correlations of integrate-and-fire (IF) neurons with pure spike-triggered adaptation (Schwalger and Lindner 2013) that allowed us to relate features of the neural dynamics to various correlation patterns. However, the model in that study did not include subthreshold adaptation, which is relevant for many adaptation currents. How subthreshold adaptation affects ISI correlations is still poorly understood from the theoretical point of view, and it is the central question of our study.

In this article, we study the popular two-dimensional adaptive exponential integrate-and-fire (aEIF) model, proposed by Brette and Gerstner (2005) as a combination of the exponential integrate-and-fire (EIF) model (Fourcaud-Trocmé et al. 2003) and a second slow variable as in Izhikevich’s model (Izhikevich 2003). Besides the often studied form of spike-triggered adaptation, the system is also equipped with a subthreshold coupling between adaptation variable and voltage, which endows the model with much richer dynamical behavior and enables complex spiking patterns as worked out in detail by Touboul and Brette (2008) (it shares this rich repertoire of firing patterns with the earlier model by Izhikevich (2003)). We consider a stochastic version of the model that includes a white Gaussian current noise, mimicking in a simple way the fluctuations mentioned above. The advantages of the aEIF model - as opposed to other two-dimensional models of similar complexity - are the exponential spike initiation similar to that seen in real action potentials and a direct relation of its parameters to physiological ones. Moreover, the deterministic aEIF model exhibits a wide range of firing patterns (Clopath et al. 2007) that can be tuned to reproduce the behavior of all major classes of neurons. Remarkably, it has been successfully fit to Hodgkin-Huxley-type neurons (Brette and Gerstner 2005) as well as to recordings from cortical neurons (Jolivet et al. 2008).

We derive analytical expressions for the serial correlation coefficient (SCC) and the coefficient of variation (CV) in the mean-driven regime and under the assumption of weak noise. These expressions are based on characteristics of the deterministic aEIF model such as its phase response curve (PRC) and its Green's functions. We demonstrate that this theory correctly predicts the correlation pattern and the coefficient of variation up to CVs of 0.2. Furthermore, we show that a weak and slow subthreshold adaptation has only a weak effect on the correlation pattern. Most importantly, we show that strong pure subthreshold adaptation in the aEIF model results in positive correlations that decay monotonically with the lag. This is in marked contrast to the pronounced negative correlations between adjacent ISIs that have so far been regarded as the main effect of adaptation currents in stochastic neuron models.

Our paper is organized as follows. In the methods section, we introduce the model and statistics that are used in the paper and discuss the dynamical behavior of the model for strong mean input current. We then sketch the derivation of our main results. Theoretical predictions for the CV and SCC are compared to simulations of the models at different levels of noise and for the cases of pure subthreshold adaptation and spike-triggered adaptation, respectively. Furthermore, we inspect how the interspike interval correlations change upon varying the constant input current or the adaptation time scale. We conclude by a discussion of our findings in the biological context of neural signal transmission. Details of the deterministic dynamics (phase response curves, Green’s functions) can be found in the Appendix.

2 Methods

2.1 Neuron model

We consider the adaptive exponential integrate-and-fire (aEIF) neuron model (Brette and Gerstner 2005; Touboul and Brette 2008) in a normalized setup similar to that studied in Schwalger and Lindner (2013):

$$\begin{array}{@{}rcl@{}} \dot v & = & f(v) + \mu -a +\xi(t), \end{array} $$
(1)
$$\begin{array}{@{}rcl@{}} \tau_{a} \dot a & = & -a +A v+\tau_{a} {\Delta} \sum\limits_{i} \delta (t-t_{i}), \end{array} $$
(2)

Here time is measured in multiples of the membrane time constant. Equation (1) describes the evolution of the membrane voltage, which is driven by the constant mean input current μ, the adaptation variable a, and a white noise current ξ(t) of vanishing mean value and correlation function \(\langle \xi (t)\xi (t^{\prime }) \rangle =2D \delta (t-t^{\prime })\) (δ(t) is the Dirac delta function and D is the noise intensity). The nonlinear function

$$ f(v)= - v+{\Delta}_{T}\exp{[(v-1)/{\Delta}_{T}]} $$
(3)

includes a leak term and an exponential term (approximating a persistent sodium current) with \({\Delta}_{T}\) being the slope factor quantifying the sharpness of the spike. Our choice of a vanishing leak reversal potential means that our voltage variable measures the difference to the leak reversal potential, in multiples of the difference between the leak reversal potential and the voltage at which the exponent of the exponential function switches its sign. As usual for integrate-and-fire models, upon reaching the threshold \(v_{th}=1.5\), we register a spike time \(t_{i}\) and reset the voltage to \(v_{r}=0\). For the remaining parameters we choose \(\mu =25\), \({\Delta}_{T}=0.2\), and \(\tau_{a}=5\).

The variable a≥0 enters Eq. (1) as a cell-intrinsic (non-synaptic) hyperpolarizing current and is governed by Eq. (2). The dynamics of a is affected by the voltage in two ways. First, the spikes generated by the aforementioned threshold criterion lead to an abrupt increase of the adaptation variable by Δ, the so-called spike-triggered adaptation corresponding to the last term on the r.h.s. of Eq. (2). Second, the subthreshold voltage v affects the adaptation variable even if no spike is generated. The strength of this subthreshold adaptation is quantified by the parameter A. For simplicity, we adopt the assumption that the potential around which the subthreshold adaptation is linearized is equal to the leak potential (Brette and Gerstner 2005); any other value can be lumped into the base current by a simple transformation.
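As a concrete illustration (not part of the original study), the stochastic model of Eqs. (1)-(3) can be integrated with a simple Euler-Maruyama scheme. The sketch below uses the parameter values stated in the text (μ=25, Δ_T=0.2, τ_a=5, v_th=1.5, v_r=0) together with the adaptation and noise values of Fig. 1a (A=22, Δ=2, D=10⁻³); the time step, the simulation length, and the function name are our own illustrative choices.

```python
import numpy as np

def simulate_aeif(mu=25.0, delta_T=0.2, tau_a=5.0, A=22.0, Delta=2.0,
                  D=1e-3, v_th=1.5, v_r=0.0, dt=1e-4, t_max=20.0, seed=0):
    """Euler-Maruyama integration of Eqs. (1)-(3) with threshold-reset rule."""
    rng = np.random.default_rng(seed)
    n_steps = int(t_max / dt)
    v, a = v_r, 0.0
    spike_times = []
    noise_scale = np.sqrt(2.0 * D * dt)   # increment scale of the white noise
    for i in range(n_steps):
        f = -v + delta_T * np.exp((v - 1.0) / delta_T)   # Eq. (3)
        v_new = v + dt * (f + mu - a) + noise_scale * rng.standard_normal()
        a += dt * (-a + A * v) / tau_a    # subthreshold part of Eq. (2)
        v = v_new
        if v >= v_th:                     # spike: register time, reset, kick a
            spike_times.append((i + 1) * dt)
            v = v_r
            a += Delta                    # spike-triggered jump in Eq. (2)
    return np.array(spike_times)

spikes = simulate_aeif()
isis = np.diff(spikes)
```

For the weak noise used here, the resulting ISI sequence stays close to the period of the deterministic limit cycle discussed in the next section.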

2.2 Behavior of the model with strong mean input current

In the absence of noise (D=0), for sufficiently large input current μ and not too strong subthreshold adaptation, the aEIF neuron exhibits periodic spiking (Touboul and Brette 2008; Naud et al. 2008) (after transients due to initial conditions have decayed). In this case, there exists a deterministic limit cycle \((v_{0}(t),a_{0}(t))\) with a stable period \(T^{*}\) (i.e. the deterministic ISI) and a constant value \(a^{*}=a\left (t_{i}^{+}\right )\) (here \(a\left (t_{i}^{+}\right )\) denotes the value immediately after the spike time \(t_{i}\)). Such a limit cycle for large values of μ and A is shown in Fig. 1a. The figure also illustrates the existence of a fixed point, which is, however, unstable.

Fig. 1
figure 1

Phase-space trajectories for different strengths of subthreshold adaptation. (a): For a moderate value of A = 22 a stable limit cycle (thick solid line) is the only attractor in the system. Small noise (D = 10−3 in all panels) will lead to small deviations from the limit cycle, which are described faithfully by our perturbation calculation. (b): For a larger value of A = 28, the stable limit cycle coexists with a stable fixed point (target of the thick dark grey trajectory). The fixed point is given by the intersection of the nullclines (dotted lines in all panels) for the voltage, \(a_{\dot {v}=0}(v)=f(v)+\mu \), and the adaptation variable, \(a_{\dot {a}=0}(v)=Av\). The domain of attraction of the fixed point is rather small: the light grey trajectory starts close to the initial point of the dark grey trajectory but ends up on the limit cycle. For long simulations started close to the limit cycle, the ISI statistics can still be described by our theory (which is based on perturbations around the limit cycle) because the noise is too weak for transitions to the fixed-point attractor to be observed. (c): For A = 31.5 and, hence, beyond a critical value of A, the limit cycle solution is no longer stable and only the stable fixed point exists. This excitable regime is clearly outside the scope of our analytical approximation. In all panels: Δ = 2, μ = 25

If we now increase the strength of subthreshold adaptation, the fixed point becomes stable via a subcritical Andronov-Hopf bifurcation. A typical trajectory for this case is shown in Fig. 1b and displays the bistability between the stable limit cycle and the stable focus. Both attractors are separated by an unstable limit cycle. For weak noise (e.g. D=0.001 as used in the simulations of Fig. 1), transitions between the two states practically do not occur in a finite simulation time. For stronger noise, switching between the two states may result in a high variability of the interspike intervals. In the following, we will mark parameter regions of bistability by a grey background in all plots where appropriate. For numerical simulations in this parameter regime we will always start the system close to the limit cycle (in the tonically firing state). Finally, if the subthreshold adaptation parameter is increased beyond a critical value (e.g. by choosing A=31.5 as in Fig. 1c), the stable focus is the only remaining stable attractor in the system (cf. Fig. 1c; see footnote 1).

For our analytical approximation we will assume throughout the paper that the system stays close to the limit cycle and that the noise is so weak that it changes the mean ISI only slightly compared to the deterministic case. We will see that our theory (which is entirely based on calculations of perturbations around the limit cycle) also works well in the region of bistability as long as we choose initial conditions close to the limit cycle.

2.3 Statistics of interest

Subsequent spike times of the model define the interspike intervals (ISIs) \(T_{i}=t_{i}-t_{i-1}\). The firing rate of the neuron is given by the inverse of the mean interval

$$ r=\frac{1}{\langle T_{i} \rangle} $$
(4)

where the average \(\langle {\dots } \rangle \) is taken over the sequence of ISIs. It is instructive to relate time scales and rates in our non-dimensional model to real-time values. With a membrane time constant of \(\tau_{m}=20\) ms, the values \(\tau_{a}=5\) and r=1 correspond to an adaptation time constant of 100 ms and a firing rate of 50 Hz.
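A minimal sketch of this conversion, assuming as in the text a membrane time constant of 20 ms:

```python
tau_m_ms = 20.0      # assumed membrane time constant in milliseconds
tau_a, r = 5.0, 1.0  # nondimensional adaptation time constant and firing rate

tau_a_ms = tau_a * tau_m_ms        # adaptation time constant: 100 ms
rate_hz = r / (tau_m_ms * 1e-3)    # firing rate: 50 Hz
```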

The variability of the intervals can be quantified by the coefficient of variation (CV), the ratio of the intervals’ standard deviation and its mean:

$$ C_{v}=\frac{\sqrt{\langle (T_{i}-\langle{T_{i}}\rangle)^{2} \rangle}}{\langle T_{i}\rangle} $$
(5)

The CV is zero for a strictly periodic spike train, as generated, for instance, by our system in the absence of noise (D=0). The CV of a completely random spike train, a Poisson process, is one.

The statistics of central interest in our study is the serial correlation coefficient (SCC), given by

$$ \rho_{k}=\frac{\langle (T_{i}-\langle{T_{i}}\rangle)(T_{i+k}-\langle{T_{i+k}}\rangle) \rangle }{\langle{(T_{i}-\langle{T_{i}}\rangle)^{2}}\rangle}, $$
(6)

where \(T_{i}\) and \(T_{i+k}\) are two ISIs separated by an integer lag k. A positive coefficient \(\rho_{1}\) implies that adjacent intervals \(T_{i}\) and \(T_{i+1}\) show (on average) similar deviations from the mean, i.e. they are both shorter or both longer than \(\langle T_{i} \rangle\). In contrast, negative correlations between adjacent intervals mean that a short interval is followed by a long one and/or vice versa. Typically, whatever the sign of the correlations, they become smaller if we consider intervals that are further apart; in particular, \(\lim \limits _{k\to \infty } \rho _{k}=0\).
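The statistics defined in Eqs. (4)-(6) can be estimated from a recorded ISI sequence; the following helper (our own sketch, not from the paper) implements the standard estimators. As a sanity check, a perfectly alternating short-long ISI sequence yields the extreme anticorrelation described above.

```python
import numpy as np

def cv_and_scc(isis, max_lag=5):
    """Estimate the CV (Eq. 5) and the SCC rho_k (Eq. 6) from an ISI sequence."""
    isis = np.asarray(isis, dtype=float)
    d = isis - isis.mean()                 # deviations from the mean ISI
    var = np.mean(d * d)
    cv = np.sqrt(var) / isis.mean()
    scc = np.array([np.mean(d[:-k] * d[k:]) / var
                    for k in range(1, max_lag + 1)])
    return cv, scc

# perfectly alternating short-long sequence: rho_1 = -1, rho_2 = +1
cv, scc = cv_and_scc(np.tile([0.9, 1.1], 500), max_lag=2)
```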

We note that if the system operates in the bistable regime with coexistence of limit cycle and stable focus, our statistics are, strictly speaking, not stationary but conditioned on the initial values of v and a, which are chosen close to the stable limit cycle. The problem of the dwell times in either state and their effect on the ISI statistics is an interesting one but beyond the scope of this article.

2.4 Calculation of the SCC ρ k

With weak noise input, both the voltage v(t) and the adaptation variable a(t) will deviate from the deterministic limit cycle; we denote these deviations by \(\delta v(t)=v(t)-v_{0}(t)\) and \(\delta a(t)=a(t)-a_{0}(t)\), respectively (here time t starts at a spike time \(t_{i}\), i.e. at the voltage reset). Obviously, the roundtrip time is then no longer fixed either but characterized by small deviations \(\delta T_{i}=T_{i}-T^{*}\). While the voltage always starts at the same value, there is variability in the starting points \(\delta a_{i}=\delta a(t_{i}^{+})=a_{i} -a^{*}\). It is exactly this initial value of a(t) that carries memory about previous ISIs and hence leads to correlations among the ISIs.

The weak deviation of the ISI, δ T i+1 can be related to the perturbations by the noise and by the initial perturbation of the adaptation variable via the infinitesimal phase response curves (PRCs) of the system:

$$ \delta{T_{i+1}}= - {\int}_{0}^{T^{*}}\! dt\; Z(t) \xi(t_{i}+t) - \delta{a_{i}} Z_{a}(0). $$
(7)

The PRCs Z(t) and Z a (t) can be obtained through the adjoint equations of the linearized neuron model (see Appendix). In Eq. (7), the knowledge of Z(t) suffices, because \(Z_{a}(0)=-{\int }_{0}^{T^{*}}\!ds\; Z(s)e^{-s/\tau _{a}}\). We can rewrite the above equation as follows

$$ \delta{T_{i+1}}=- Z_{a}(0)\delta{a_{i}}-\xi_{i}, $$
(8)

where \(\xi _{i}={\int }_{0}^{T^{*}}\! dt\; Z(t) \xi (t_{i}+t)\) is a sequence of uncorrelated Gaussian random numbers with correlation function

$$ \langle \xi_{i} \xi_{i+k} \rangle=2D \delta_{k,0} {\int}_{0}^{T^{*}} \! ds\; Z^{2}(s), $$
(9)

where δ i,j is the Kronecker delta. A useful relation for the deviation δ a i is based on the formal integration of Eq. (2) over one ISI, connecting \(a_{i+1}=a\left (t_{i+1}^{+}\right )\) and \(a_{i}=a\left (t_{i}^{+}\right )\)

$$ a_{i+1}=a_{i} e^{\frac{-T_{i+1}}{\tau_{a}}}+\frac{A}{\tau_{a}}{\int}_{t_{i}}^{t_{i}+T_{i+1}} dt^{\prime} v(t^{\prime}) e^{\frac{-(t_{i}-t^{\prime})}{\tau_{a}}}e^{\frac{-T_{i+1}}{\tau_{a}}}+{\Delta}. $$
(10)

Without noise and under the assumption made above, this adaptation map (Touboul and Brette 2008) will approach a steady state, which is formally given by

$$ a^{*}=\frac{\Delta+(A/\tau_{a})e^{-T^{*}/\tau_{a}}{\int}_{0}^{T^{*}} \! ds\; v_{0}(s) e^{s/\tau_{a}}}{1-\exp[-T^{*}/\tau_{a}]}. $$
(11)

This is not a closed-form solution, because it still contains the period \(T^{*}\) of the deterministic limit cycle and the voltage trajectory \(v_{0}(t)\), both of which also depend on \(a^{*}\).

In the presence of noise (D>0), linearizing Eq. (10) with respect to the small deviations \(\delta v(t)\), \(\delta a_{i}\), and \(\delta T_{i+1}\) results in

$$\begin{array}{@{}rcl@{}} \delta{a_{i+1}} &=& \alpha \delta{a_{i}} -\beta\delta{T_{i+1}} + \alpha\frac{A}{\tau_{a}}{\int}_{0}^{T^{*}}\!\!ds\;\delta v(s)\;e^{s/\tau_{a}} \end{array} $$
(12)

with

$$\begin{array}{@{}rcl@{}} \alpha &=&\exp[-T^{*}/\tau_{a}], \end{array} $$
(13)
$$\begin{array}{@{}rcl@{}} \beta &=& a^{*}\frac{\alpha}{\tau_{a}} +A\frac{\alpha}{\tau_{a}} \left[ \frac{1}{ \tau_{a}}{\int}_{0}^{T^{*}}\!\!ds\; v_{0}(s)\;e^{s/\tau_{a}} -\frac{{v_{th}}}{\alpha} \right]. \end{array} $$
(14)

Note that besides the obvious dependence on the amplitude A in these equations, the deterministic limit cycle (v 0(t),a 0(t)) and its period also depend on the strength of subthreshold and spike-triggered adaptation, A and Δ, respectively.

The last integral term in Eq. (12) can be expressed via the Green’s function δ X g (T ,s) of the perturbation dynamics from Eqs. (1)-(2) (see Appendix), as

$$\begin{array}{@{}rcl@{}} \frac{\alpha}{\tau_{a}}{\int}_{0}^{T^{*}}\!\!ds\;\delta v(s)\;e^{s/\tau_{a}} &\,=\,&-\delta a_{i}{\int}_{0}^{T^{*}} \!\! ds\; \delta X_{g}(T^{*},s) \; e^{-s/\tau_{a}}\\ &&+\!{\int}_{0}^{T^{*}}\!\!ds \;\delta X_{g}(T^{*},s)\;\xi(t_{i}\,+\,s). \end{array} $$
(15)

Combining this relation with Eq. (12) and Eq. (7), we obtain a stochastic map for δ a i :

$$ \delta{a_{i+1}} = \alpha \theta \cdot \delta{a_{i}} + \eta_{i} $$
(16)

where

$$ \theta= 1+\frac{\beta Z_{a}(0)}{\alpha} -\frac{A}{\alpha}{{\int}_{0}^{T^{*}}\! ds \;\delta X_{g}(T^{*},s) e^{-s/\tau_{a}}} $$
(17)

(where we used that \(Z_{a}(0)=-{\int }_{0}^{T^{*}} ds \exp (-s/\tau _{a}) Z(s)\), see Appendix) and

$$ \eta_{i}= {\int}_{0}^{T^{*}} \!\!ds\; (\beta Z(s) +A\delta X_{g}(T^{*},s))\;\xi(t_{i}+s) $$
(18)

are Gaussian random numbers with correlation function

$$ \langle \eta_{i} \eta_{i+k} \rangle=2D \delta_{k,0} {\int}_{0}^{T^{*}} \!\!ds\; (\beta Z(s) +A\delta X_{g}(T^{*},s))^{2} $$
(19)

The stochastic map Eq. (16) can be formally solved [by recursively applying Eq. (16)], yielding

$$ \delta a_{i+k}=(\alpha {\theta})^{k} \delta a_{i} +\sum\limits_{j=0}^{k-1} (\alpha {\theta})^{j} \eta_{i+k-1-j}, $$
(20)

from which by multiplication with δ a i and averaging, we obtain

$$ \langle \delta a_{i+k} \delta a_{i} \rangle=(\alpha {\theta})^{k} \langle \delta {a_{i}^{2}} \rangle= (\alpha{\theta})^{k} \frac{\langle {\eta_{i}^{2}} \rangle}{1-(\alpha{\theta})^{2}} $$
(21)

(the last relation follows from squaring both sides of Eq. (16) and averaging).
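The geometric decay in Eq. (21) is the standard autocovariance of a first-order autoregressive process and can be checked numerically by iterating the stochastic map Eq. (16). In the sketch below, the values chosen for αθ and for the standard deviation of \(\eta_i\) are purely illustrative stand-ins, not quantities computed from the model:

```python
import numpy as np

rng = np.random.default_rng(1)
c, sigma = 0.6, 0.5           # illustrative stand-ins for alpha*theta, std of eta_i
n = 200_000

eta = sigma * rng.standard_normal(n)  # uncorrelated Gaussian numbers eta_i
da = np.empty(n)
da[0] = 0.0
for i in range(n - 1):                # iterate the stochastic map, Eq. (16)
    da[i + 1] = c * da[i] + eta[i]

var_pred = sigma**2 / (1 - c**2)      # stationary variance, Eq. (21) with k = 0
# empirical autocovariances <da_{i+k} da_i> for lags k = 0..3
cov_emp = [np.mean(da[: n - k] * da[k:]) for k in range(4)]
```

The empirical autocovariances agree with the predicted geometric law \((\alpha\theta)^{k}\langle\eta_i^2\rangle/(1-(\alpha\theta)^2)\) up to sampling error.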

We are ultimately interested in the correlations of the ISIs, which can be in the stationary case expressed according to Eq. (9) as

$$\begin{array}{@{}rcl@{}} \langle \delta T_{i+k} \delta T_{i} \rangle&=&\langle \delta T_{i+k+1} \delta T_{i+1} \rangle\\ &=&\langle (Z_{a}(0)\delta a_{i+k}+\xi_{i+k})(Z_{a}(0)\delta a_{i}+\xi_{i}) \rangle, \end{array} $$
(22)

from which we get

$$\begin{array}{@{}rcl@{}} \langle \delta T_{i+k} \delta T_{i} \rangle&=&{Z_{a}^{2}}(0) (\alpha {\theta})^{k} \frac{\langle {\eta_{i}^{2}} \rangle}{1-(\alpha{\theta})^{2}}\\ &&+Z_{a}(0) (\alpha {\theta})^{k-1}(1-\delta_{k,0}) \langle \xi_{i} \eta_{i} \rangle\\ &&+\delta_{k,0} \langle {\xi_{i}^{2}} \rangle. \end{array} $$
(23)

From this relation, the variance of the ISI (k=0) and the covariance can be determined and from these statistics we can calculate the CV and the SCC. Because the formulas are lengthy, a thoughtful choice of auxiliary expressions is helpful and convenient for checking limit cases.

For the squared CV we obtain

$$ {C_{v}^{2}}=\frac{2 D}{(T^{*})^{2}}\left(\bar{E}+\frac{\hat{\beta}^{2}}{1-(\alpha{\theta})^{2}} \bar{G}\right) $$
(24)

and for the SCC with k>0 we get

$$ \rho_{k}=\varphi^{k-1} \frac{(1-\varphi^{2})\hat{\beta}\bar{F}+\varphi \hat{\beta}^{2} \bar{G}}{(1-\varphi^{2})\bar{E}+\hat{\beta}^{2} \bar{G}}. $$
(25)

In these formulas we used the abbreviations:

$$\begin{array}{@{}rcl@{}} \hat{\beta}&=&e^{-T^{*}/\tau_{a}} \frac{Z_{a}(0)}{\tau_{a}}\left[a^{*}+A\left({\int}_{0}^{T^{*}} \frac{ds}{\tau_{a}} v_{0}(s) e^{s/\tau_{a}}\right.\right.\\ &&\left.\left.-{v_{th}}e^{T^{*}/\tau_{a}}\vphantom{{\int}_{0}^{T^{*}}}\right)\right], \end{array} $$
(26)
$$\begin{array}{@{}rcl@{}} \varphi&=& e^{-T^{*}/\tau_{a}}+\hat{\beta} -A{\int}_{0}^{T^{*}}\! ds \;e^{-s/\tau_{a}} \delta X_{g}(T^{*},s), \end{array} $$
(27)
$$\begin{array}{@{}rcl@{}} \bar{E}&=& {\int}_{0}^{T^{*}} \! ds\; Z^{2}(s), \end{array} $$
(28)
$$\begin{array}{@{}rcl@{}} \bar{F}&=&{\int}_{0}^{T^{*}} \! ds\; Z(s)\left(Z(s)+\frac{A}{\hat{\beta}} Z_{a}(0)\delta X_{g}(T^{*},s)\right) , \end{array} $$
(29)
$$\begin{array}{@{}rcl@{}} \bar{G}&=&{\int}_{0}^{T^{*}} \! ds\; \left(Z(s)+\frac{A}{\hat{\beta}} Z_{a}(0)\delta X_{g}(T^{*},s)\right)^{2}. \end{array} $$
(30)

The values of \(T^{*}\), \(a^{*}\), and \(Z_{a}(0)\) and the functions \(v_{0}(s)\), \(Z(s)\), and \(\delta X_{g}(T^{*},s)\) can be determined numerically from the deterministic system; see Appendix.

In the limit case of A=0 (pure spike-triggered adaptation), \(\bar {E}=\bar {F}=\bar {G}\) and \(\hat {\beta }=\varphi -e^{-T^{*}/\tau _{a}}\) (here Eq. (17) has been used). Using \({\theta }=\varphi /\alpha =\varphi e^{T^{*}/\tau _{a}}\), the SCC in this case reads

$$ \rho_{k}=-(\alpha{\theta})^{k-1}\frac{\alpha (1-{\theta})(1-\alpha^{2}{\theta})}{1+\alpha^{2}-2\alpha^{2}{\theta}}, $$
(31)

which agrees with the result by Schwalger and Lindner (2013).
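For a quick numerical look at this limit case, Eq. (31) can be evaluated directly. The values of α and θ below are purely illustrative (they are not derived from the model); a negative θ gives \(b=\alpha\theta<0\) and hence the oscillatory, negative-at-lag-one pattern that the theory predicts for pure spike-triggered adaptation:

```python
def scc_pure_spike_triggered(alpha, theta, k):
    """Evaluate Eq. (31): SCC at lag k for pure spike-triggered adaptation (A = 0)."""
    num = alpha * (1.0 - theta) * (1.0 - alpha**2 * theta)
    den = 1.0 + alpha**2 - 2.0 * alpha**2 * theta
    return -((alpha * theta) ** (k - 1)) * num / den

# illustrative values: alpha = exp(-T*/tau_a) in (0, 1), theta chosen negative
alpha, theta = 0.5, -0.5
rho = [scc_pure_spike_triggered(alpha, theta, k) for k in (1, 2, 3)]
```

For these values, \(\rho_1 = -0.5625\) and successive coefficients alternate in sign with the geometric ratio αθ.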

3 Results

Our general result for the SCC has the form

$$ \rho_{k}=a \cdot b^{k-1}, \quad k\ge 1. $$
(32)

Although the evaluation of the two parameters a and b requires numerical computations for the deterministic (noiseless) system, the result Eq. (32) already allows a number of insights without any numerical work.

First of all, the dependence of the serial correlation coefficient on the lag k is that of a geometric series. This is a remarkably simple result, which constrains the possible SCC patterns that can occur in this model. Compared to the previous result for pure spike-triggered adaptation (Schwalger and Lindner 2013), it also demonstrates that subthreshold adaptation does not change the principal mathematical structure of the solution. Secondly, the parameters a and b may attain different signs, which leads to different correlation patterns. For b>0 the possible patterns are a monotonic reduction of initially positive (a>0) or negative (a<0) correlations. For b<0 the correlations oscillate with increasing lag, starting with a positive (a>0) or negative (a<0) correlation coefficient at lag one. As the numerical evaluation below reveals, in the presence of both subthreshold and spike-triggered adaptation, all four correlation patterns can indeed be found by varying parameters. Note that for the case of pure spike-triggered adaptation in an exponential integrate-and-fire neuron, only b<0 is possible, corresponding to only two of the above four cases (Schwalger and Lindner 2013). Hence, although subthreshold adaptation does not change the geometric dependence on the lag, the correlation patterns nevertheless change qualitatively under subthreshold adaptation because positive correlations become possible.

In the following we compare the theoretical results for the CV and the correlation coefficients to results of stochastic simulations of Eqs. (1) and (2). For completeness we will also compare the firing rate with its deterministic limit, i.e. the inverse of T , the limit cycle period of the noiseless system.

3.1 Possible correlation patterns

In Fig. 2 we compare the correlation coefficient \(\rho_{k}\) as a function of the lag to numerical simulations. We do so for various combinations of the subthreshold adaptation parameter A and the spike-triggered adaptation parameter Δ, including the cases of pure subthreshold adaptation (Δ=0, first row in Fig. 2) and pure spike-triggered adaptation (A=0, leftmost column in Fig. 2). All other parameters are fixed except for the noise intensity. We choose different noise intensities for different values of A (different columns in Fig. 2) because the noise intensity is the small parameter of our theory; how small it must be depends strongly on the other parameters (see footnote 2). Note that with our choice of parameters we accept that the model is not in the physiological regime for all combinations of A and Δ. If adaptation is switched off (i.e. for A=0, Δ=0), we observe firing rates r>10, which, assuming \(\tau_{m}=20\) ms, correspond to rates in real time larger than 500 Hz. More physiologically reasonable parameter changes will be discussed below.

Fig. 2
figure 2

Correlation patterns induced by spike-triggered and subthreshold adaptation. Comparison of theory (lines) and simulation results (symbols). Varied are the values of Δ (different rows) and A (different columns). For the different columns the value of the noise intensity has been adjusted as indicated. The grey shading indicates bistability of the deterministic system, i.e. for these parameters a stable focus point and the limit cycle coexist. Only perturbations of the limit cycle, however, are taken into account by our theory and play a role at the small noise intensities used in the simulations

The simple parameter variation in Fig. 2 illustrates several points we want to make. First of all, if the noise level is appropriately adjusted, our theory (lines) reproduces the simulation results (symbols) for a wide range of parameter values of the subthreshold and spike-triggered adaptation. The results of our calculation for perturbations around the limit cycle are valid in the tonic firing regime (all panels in Fig. 2 with white background) but also in the case of bistability between focus and limit cycle (panels in Fig. 2 with grey background) if trajectories are started close to the limit cycle. Generally, as a rule of thumb, our theory works well for CVs below 0.2.

Secondly, as hypothesized above, we can observe all kinds of correlation patterns that are possible according to Eq. (32). Starting with the trivial case of the renewal process in Fig. 2-a1 in the absence of both adaptation mechanisms (A=0, Δ=0), we see in the leftmost column the patterns for pure spike-triggered adaptation (Schwalger and Lindner 2013): a negative SCC at lag one that decays monotonically for small Δ (e.g. Fig. 2-c1) or in an oscillatory fashion for larger values of Δ (cf. Fig. 2-d1). For pure subthreshold adaptation (Δ=0, first row in Fig. 2), the neuron displays positive correlations at all lags. This is in marked contrast to the negative correlations induced by spike-triggered adaptation. Positive correlations can be as strong as \(\rho_{1}=0.4\) for larger values of A (Fig. 2-a4). Positive correlations can even be enhanced if weak spike-triggered adaptation is added (compare e.g. Fig. 2-a3 and b3). By a combination of both spike-triggered and subthreshold adaptation, we can furthermore observe oscillatory correlations that start with \(\rho_{1}>0\) (cf. Fig. 2-c4). Finally, if both spike-triggered and subthreshold adaptation are very strong, serial correlations are suppressed (cf. Fig. 2-d4). This can be explained as follows: due to a strongly reduced firing rate, the adaptation process becomes fast compared to each single ISI, a limit in which we can expect the spike statistics of a renewal process.

Although positive correlations in our examples occur when the limit cycle coexists with the stable fixed point (as indicated by the grey background of the plots and as illustrated in Fig. 1b), such bistability is not a necessary condition for positive ISI correlations. Below we give a counterexample.

In order to illustrate the interval statistics, we show in Fig. 3 time courses of the voltage and the adaptation variable and also plot the interspike interval sequence. We use two selected parameter sets from Fig. 2, namely, the case of strong pure spike-triggered adaptation (Fig. 2-d1) in (a) and that of strong subthreshold adaptation (Fig. 2-c3) in (b). In the former case, we observe a rather noisy voltage trace and an almost linear decay of the adaptation variable, and we see explicitly how the ISIs deviate in an alternating fashion from the mean ISI. With such a pronounced alternation, we can expect that adjacent intervals, and generally intervals with an even number of intervals between them, become negatively correlated, whereas intervals with an odd number of intervals between them become positively correlated, i.e. we can expect a correlation coefficient that oscillates with the lag. For strong subthreshold adaptation with weaker noise, shown in Fig. 3b, the voltage time series is more regular and the time course of the adaptation variable is nonmonotonic even at subthreshold voltage. This is due to the strong coupling between voltage and adaptation for large values of A and leads to pronounced positive correlations.

Fig. 3
figure 3

Time courses of membrane voltage and adaptation variable in two distinct cases with pronounced interspike interval correlations. SCC patterns become apparent by plotting the ISI sequence (solid line) and its mean value (dashed) versus time (top); voltage (middle) and adaptation variable (bottom) are shown as well. Parameters are as in Fig. 2-d1 (for panel a) and in Fig. 2-c3 (for panel b)

3.2 Effects of increasing the subthreshold adaptation on the firing statistics

Above we have verified that our theory leads to correct results if the noise intensity is appropriately adjusted. Now, we are particularly interested in the effect of increasing the subthreshold adaptation without changing any other parameters. We do this for two different values of the spike-triggered adaptation: Δ=0 in Fig. 4a, to observe how the firing statistics change in the case of pure subthreshold adaptation, and Δ=5 in Fig. 4b, to observe how subthreshold adaptation affects the correlation statistics in the presence of pronounced spike-triggered adaptation. There is a natural limit to increasing A because, as discussed above in Section 2.2, the system eventually reaches an excitable regime.

Fig. 4
figure 4

Firing statistics as a function of strength of subthreshold adaptation. From top to bottom: firing rate, coefficient of variation of the interspike interval, serial correlation coefficient at lag one and at lag two. For all data we compare again theory (lines) to simulation results (symbols) obtained with fixed noise intensity as indicated. Note that in our theory only the CV depends on the noise intensity (the other panels have only one theory curve). Strength of spike-triggered adaptation is Δ=0 in (a) [pure subthreshold adaptation] and Δ = 5 in (b). The range of A for which the deterministic dynamics displays bistability between limit cycle and stable focus is marked in grey

It becomes evident from Fig. 4a that weak subthreshold adaptation alone (Δ=0, A<10) affects the firing rate and the CV only weakly and does not cause strong ISI correlations. Larger values of A induce purely positive correlations and lead to a strong increase in the CV. The comparison with our theory reveals that values of the CV up to 0.2 are well reproduced by Eq. (24).

With finite spike-triggered adaptation (Δ=5 in Fig. 4b), there is a sizeable effect on the ISI correlations already at small values of A (below 10). The correlation coefficient at lag two, ρ_2, for instance, increases by 40 % when A is increased from 0 to 10. Upon further increase of A, both ρ_1 and ρ_2 change sign (as already discussed in the context of Fig. 2). The change takes place at about A=20, at which point the interval statistics may resemble those of a renewal process. Interestingly, in the case of finite spike-triggered adaptation, the CV becomes a nonmonotonic function of the subthreshold adaptation strength: it displays a shallow minimum as a function of A.

The agreement of our theory with the simulation results is good at all inspected noise levels, except when the system is deep in the bistable regime (shaded regions in Fig. 4), where finite noise may cause transitions between the limit cycle and the stable focus. Such bistability in the firing pattern leads to strong ISI variability, which is reflected in large values of the CV. Because in our theory ISI correlations arise solely from perturbations of the tonic firing, the increased ISI variability also reduces the serial correlation coefficient (which is normalized by the ISI variance). This drop in ρ_k is indeed seen in all situations where the CV deviates strongly from the predictions of our limit-cycle-based theory.
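For reference, the serial correlation coefficient at lag k can be estimated from a recorded ISI sequence as the lag-k autocovariance normalized by the ISI variance. The normalization below is one common convention; the exact estimator used for the figures may differ in its finite-size factors.

```python
import numpy as np

def serial_correlation(isis, k):
    """Estimate the serial correlation coefficient rho_k of an ISI
    sequence: lag-k autocovariance normalized by the ISI variance."""
    x = np.asarray(isis, dtype=float)
    if k <= 0 or k >= len(x):
        raise ValueError("lag k must satisfy 0 < k < len(isis)")
    dx = x - x.mean()
    # population variance in the denominator, matching the
    # normalization by the ISI variance mentioned in the text
    return np.mean(dx[:-k] * dx[k:]) / np.var(x)
```

As a sanity check, a perfectly alternating short/long ISI sequence gives ρ_1 close to −1 and ρ_2 close to +1.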

As mentioned above, the high mean input used in Figs. 2 and 4 allowed us to detect all possible ISI correlation patterns by adjusting only the strengths of spike-triggered and subthreshold adaptation. However, for some combinations of Δ and A the strong mean drive also leads to unphysiologically high firing rates. In Fig. 5 we therefore adjust the input current μ for every value of A such that the firing rate is kept at r=0.6 (with τ_m=20 ms this corresponds to a physiologically plausible 30 Hz firing rate in real time). Note that the constraint of a fixed firing rate also restricts the range of possible A values. We again compare our theory to numerical simulations for pure subthreshold adaptation, Δ=0, in Fig. 5a and for mixed spike-triggered and subthreshold adaptation, Δ=5, in Fig. 5b, and find good agreement in both cases, in particular for low noise intensities. While at fixed firing rate the CV varies only little with A, the correlation coefficients depend strongly on its choice. Generally, increasing A shifts the ISI correlations towards positive values for both values of Δ. Remarkably, in this scaling it becomes evident in Fig. 5a that purely positive ISI correlations can also be observed outside the bistable range of A (shaded parameter range), i.e. in the tonic firing regime (regions with white background in Fig. 5) that our theory was originally designed for. Hence, positive ISI correlations as predicted by our theory do not hinge on the presence of a stable focus in the deterministic dynamics.

Fig. 5

Firing statistics as a function of strength of subthreshold adaptation with a fixed firing rate. From top to bottom: firing rate (kept at 0.6), coefficient of variation of the interspike interval, serial correlation coefficient at lag one and at lag two. For all data we compare again theory (lines) to simulation results (symbols) obtained with fixed noise intensity as indicated. Note that in our theory only the CV depends on the noise intensity (the other panels have only one theory curve). Strength of spike-triggered adaptation is Δ=0 in (a) [pure subthreshold adaptation] and Δ=5 in (b). The range of A for which the deterministic dynamics displays bistability between limit cycle and stable focus is marked in grey

The simplest way to change the firing statistics experimentally is to inject a constant current and vary its value. In our setup this corresponds to a variation of the parameter μ, illustrated in Fig. 6. Here we inspect two cases of weak (a) or moderate (b) subthreshold adaptation. In both cases, an increase in μ causes an increase in firing rate and a reduction of variability, in line with previous findings for white-noise-driven integrate-and-fire neurons (Vilela and Lindner 2009). An increasing constant current leads to strong negative correlations between adjacent ISIs if A is small and Δ is large (Fig. 6a), and yields a transition from positive to negative correlations if the subthreshold component of adaptation is more pronounced (Fig. 6b). In particular, if the latter transition were observed in a real neuron, it would suggest that both spike-triggered and subthreshold adaptation are present in that cell.
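The monotonic increase of the firing rate with μ can already be illustrated for the deterministic exponential IF neuron without adaptation, whose period in the mean-driven regime is the quadrature T(μ) = ∫ dv / f(v) from reset to threshold. The parameter values below are again illustrative placeholders, not those used in the figures.

```python
import numpy as np

def eif_rate(mu, v_T=1.0, Delta_T=1.0, v_r=0.0, v_th=5.0, n=100000):
    """Deterministic firing rate r = 1/T of an exponential IF neuron
    without adaptation, with T = int_{v_r}^{v_th} dv / f(v) and
    f(v) = mu - v + Delta_T * exp((v - v_T)/Delta_T).  Valid only in
    the mean-driven regime where f(v) > 0 on the whole range."""
    v = np.linspace(v_r, v_th, n)
    f = mu - v + Delta_T * np.exp((v - v_T) / Delta_T)
    if np.any(f <= 0):
        raise ValueError("not in the mean-driven regime for this mu")
    dv = v[1] - v[0]
    # trapezoidal quadrature of the passage time
    T = np.sum(0.5 * (1.0 / f[:-1] + 1.0 / f[1:])) * dv
    return 1.0 / T
```

Evaluating this for increasing μ reproduces the monotonic rise of the deterministic rate that underlies the trend seen in the simulations.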

Fig. 6

Firing statistics as a function of input current. From top to bottom: firing rate, coefficient of variation of the interspike interval, serial correlation coefficient at lag one and at lag two. For all data we compare again theory (lines) to simulation results (symbols) obtained with fixed noise intensity as indicated. Parameters are A=10, Δ=5, τ_a=5, with μ varied within [10,28] in (a), and A=15, Δ=2, τ_a=3, with μ varied within [16,28] in (b); all other parameters remain the same

Finally, we demonstrate in Fig. 7 that our theory works well for arbitrary values of the adaptation time constant τ_a by varying it over four orders of magnitude; i.e., our approach does not require a slow adaptation variable. First of all, ISI correlations vanish for τ_a→0 because the adaptation variable decays too quickly to retain any memory of previous ISIs. In the opposite limit \(\tau _{a} \to \infty \), the correlation between adjacent intervals approaches the value −1/2 while correlations at higher lags decay again. On physical grounds, the cumulative correlations \(\sum \nolimits _{j=1}^{\infty }\rho _{j}\) cannot become smaller than −1/2 and, hence, adjacent intervals attain the strongest anticorrelation compatible with this bound if the adaptation time constant grows without bound. This is, however, also accompanied by a strong reduction of the firing rate.
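The lower bound on the cumulative correlations can be made explicit from the nonnegativity of the count variance: for a stationary ISI sequence with variance ⟨ΔT²⟩, the variance of the sum of n successive intervals obeys

$$ \text{var}\left(\sum\limits_{i=1}^{n} T_{i}\right)= n\,\langle {\Delta} T^{2}\rangle\left[1+2\sum\limits_{k=1}^{n-1}\left(1-\frac{k}{n}\right)\rho_{k}\right]\geq 0, $$

and letting n→∞ (assuming the ρ_k decay sufficiently fast) yields \(1+2\sum \nolimits _{k=1}^{\infty }\rho _{k}\geq 0\), i.e. \(\sum \nolimits _{j=1}^{\infty }\rho _{j}\geq -1/2\).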

Fig. 7

Firing statistics as a function of the adaptation time constant. From top to bottom: firing rate, coefficient of variation of the interspike interval, serial correlation coefficient at lag one and at lag two. For all data we compare again theory (lines) to simulation results (symbols) obtained with fixed noise intensity as indicated. Remaining parameters are A = 15, Δ = 2, μ = 100

4 Summary and discussion

In this paper we have derived formulas for the CV and SCC of a white-noise-driven exponential integrate-and-fire model with both spike-triggered and subthreshold adaptation. Our general result predicts a geometric dependence of the SCC on the lag. Furthermore, according to our theory, by varying the adaptation parameters we can obtain all possible correlation patterns: positive or negative correlations at lag one, combined with a monotonic decay or a damped oscillation with respect to the lag. By comparison with extensive numerical simulations, we have demonstrated that our theory indeed reliably predicts all of these correlation patterns if the neuron is kept in the mean-driven tonic firing regime and the noise is weak. Moreover, our theory also works in the regime of bistability (coexistence of a stable focus and a stable limit cycle) if the model's initial condition is close to the tonic firing state and the noise is sufficiently weak to avoid transitions to the stable fixed point.

Besides providing the general correlation structure, our theory relates the correlation statistics to properties of the deterministic neuron: its phase response curves (PRCs) with respect to perturbations of the voltage and of the adaptation current, and the Green's function of the adaptation variable. These functions change in a complicated and intertwined manner when the adaptation parameters are varied. In particular, we can change the neuron from type I (for small or vanishing subthreshold adaptation), with a PRC that is positive at all phases, to type II (for strong subthreshold adaptation), with a PRC that is negative at early phases. These changes also contribute to the qualitative change of the correlation pattern from negative ISI correlations for spike-triggered adaptation to positive ISI correlations for dominating subthreshold adaptation.

The results of our paper may be useful for interpreting existing studies. Prescott and Sejnowski (2008) explored the role of subthreshold and spike-triggered adaptation for signal transmission in conductance-based model neurons. These authors observed that strong negative ISI correlations are present only with spike-triggered adaptation, whereas for dominating subthreshold adaptation only weak positive or vanishing ISI correlations are possible. Although they ascribed the positive correlations to the fact that a colored Ornstein-Uhlenbeck process was used as a stimulus of the model neuron, we have seen here that uncorrelated fluctuations combined with pure or dominating subthreshold adaptation can evoke positive ISI correlations.

Our results may be applied and extended in several directions. First of all, our theory provides a possible explanation for correlation patterns of spiking neurons in vivo. Importantly, the mechanism for nonrenewal spiking discussed here is a cellular mechanism (not a network mechanism) and should thus be present both in vivo and in vitro. Because positive ISI correlations have so far mostly been associated with correlations of the input, providing an alternative explanation based on cellular adaptation mechanisms is useful, although other mechanisms may exist that lead to similar ISI correlations.

Secondly, our formulas for the CV and SCC may be used as an additional tool for fitting parameters of the aEIF model to in vivo or in vitro data. White-noise stimuli are nowadays routinely applied to neurons in vitro, and the resulting CV and ISI correlations can serve as a means to verify a parameter fit.

Thirdly, it is promising to apply our formulas to the analysis of the long-term variability of the spike count statistics. In fact, the CV and SCCs we have derived allow us to compute the asymptotic limit of the spike count's Fano factor (Cox and Lewis 1966)

$$ F_{\infty}=\lim_{t\to\infty} \frac{\langle {\Delta} N^{2}(t) \rangle}{\langle N(t) \rangle}={C_{v}^{2}}\left[1+2\sum\limits_{k=1}^{\infty} \rho_{k}\right]. $$
(33)

According to this formula, positive ISI correlations increase the Fano factor, whereas negative correlations may strongly reduce it. Consequently, the detection of a static stimulus (based on estimates of spike counts) can be facilitated by negative ISI correlations and is degraded by positive ones. These effects are well known from numerical simulations (Chacron et al. 2001; Ratnam and Nelson 2000; Prescott and Sejnowski 2008) and simplified models (Chacron et al. 2004; Nikitin et al. 2012). In the framework of our theory, the effect of ISI correlations on signal detection can be inspected and discussed for a biophysically reasonable point neuron model.
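For the geometric dependence of the SCC on the lag predicted by our theory, the sum in Eq. (33) can be evaluated in closed form. The sketch below assumes the generic parametrization ρ_k = ρ_1 α^(k−1) with |α| < 1, where ρ_1 is the lag-one coefficient and α a geometric factor; these symbols are illustrative, not the specific expressions derived in the paper.

```python
def fano_factor_geometric(cv, rho1, alpha):
    """Asymptotic Fano factor F_inf = CV^2 * (1 + 2 * sum_k rho_k),
    cf. Eq. (33), for a geometric SCC pattern rho_k = rho1 * alpha**(k-1),
    |alpha| < 1, whose geometric series sums to rho1 / (1 - alpha)."""
    if abs(alpha) >= 1:
        raise ValueError("geometric factor must satisfy |alpha| < 1")
    return cv**2 * (1.0 + 2.0 * rho1 / (1.0 - alpha))
```

For rho1 = 0 the renewal value F_inf = CV^2 is recovered; negative rho1 pushes the Fano factor below that value, positive rho1 above it, in line with the effect on stimulus detection discussed above.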