DI was estimated in a model-free manner using the computationally efficient Kraskov, Stögbauer, and Grassberger (KSG) estimator (Kraskov et al., 2004; Gao et al., 2018), which is based on a k-nearest neighbors (kNN) approach to measuring mutual information (MI). DI was estimated between every pair of channels in each SA and AA time window in a data-driven manner. Every trial was considered to be an independent sample path of an unknown underlying random process, and pairwise DI was estimated using all the trials in a given window. Given two raw time series from a pair of electrodes, $X_1^N = \{X_1, X_2, \ldots, X_N\}$ and $Y_1^N = \{Y_1, Y_2, \ldots, Y_N\}$, where $X_i, Y_i \in \mathbb{R}$, the DI from $X_1^N$ to $Y_1^N$ is denoted $I(X_1^N \to Y_1^N)$ and is defined as the following:
$$I(X_1^N \to Y_1^N) = \sum_{i=1}^{N} I(X_1^i; Y_i \mid Y_1^{i-1}) = \sum_{i=1}^{N} \left[ h(Y_i \mid Y_1^{i-1}) - h(Y_i \mid Y_1^{i-1}, X_1^i) \right] \tag{2}$$
where each term in the first sum on the right-hand side of Equation 2 is the conditional MI between the time series $X_1^i$ and the single sample point $Y_i$, conditioned on the past $i-1$ samples $Y_1^{i-1}$. As the second equality in Equation 2 shows, DI can also be expressed as a sum of conditional differential entropies, the $h$ terms. By definition, the differential entropy $h(Z)$ of a continuous random variable $Z$ with probability density function $f(z)$ is the following:
$$h(Z) = -\int f(z) \log f(z) \, dz \tag{3}$$
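For instance, for a Gaussian random variable $Z \sim \mathcal{N}(\mu, \sigma^2)$, Equation 3 evaluates to the standard closed form $h(Z) = \frac{1}{2}\log(2\pi e \sigma^2)$.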
Also, a conditional differential entropy term can be expressed as a difference of two differential entropies:
$$h(Y \mid X) = h(X, Y) - h(X) \tag{4}$$
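For example, applying Equation 4 with $X = Y_1^{i-1}$ and $Y = Y_i$ gives $h(Y_i \mid Y_1^{i-1}) = h(Y_1^i) - h(Y_1^{i-1})$, so each conditional entropy in Equation 2 becomes a difference of joint entropies that can be estimated directly from samples.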
From Equations 2 and 4, DI can be expanded as the following:
$$I(X_1^N \to Y_1^N) = \sum_{i=1}^{N} \left[ h(Y_1^i) - h(Y_1^{i-1}) - h(X_1^i, Y_1^i) + h(X_1^i, Y_1^{i-1}) \right] \tag{5}$$
Each entropy term in Equation 5 was estimated using the KSG estimator (Kraskov et al., 2004), which uses a kNN approach similar to the methodology described in Murin (2017). The DI estimator was implemented in MATLAB using the kNN tools from TRENTOOL (Lindner, 2011; Lindner et al., 2011). ECoG data were assumed to be Markovian of order $m$, i.e., each sample depends only on the past $m$ samples. Based on a non-parametric method of estimating memory order for ECoG (Murin et al., 2019) and on other similar work (Malladi et al., 2016; Murin et al., 2016), a memory order of 150 ms was used, achieved by downsampling the data before estimation. The final equation used to estimate the DI rate $\hat{I}(\tilde{X}_1^{\tilde{N}} \to \tilde{Y}_1^{\tilde{N}})$ is the following:
$$\hat{I}(\tilde{X}_1^{\tilde{N}} \to \tilde{Y}_1^{\tilde{N}}) = \frac{1}{\tilde{N} - m} \sum_{i=m+1}^{\tilde{N}} \left[ \hat{h}(\tilde{Y}_{i-m}^{i}) - \hat{h}(\tilde{Y}_{i-m}^{i-1}) - \hat{h}(\tilde{X}_{i-m}^{i}, \tilde{Y}_{i-m}^{i}) + \hat{h}(\tilde{X}_{i-m}^{i}, \tilde{Y}_{i-m}^{i-1}) \right] \tag{6}$$
where $m$ is the number of past samples (the Markov memory order after downsampling), $\tilde{X}$ and $\tilde{Y}$ are the downsampled versions of $X$ and $Y$, respectively, the $\hat{h}$'s are the estimated differential entropies, and $\tilde{N}$ is the length of the downsampled signal.
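To make the kNN estimation step concrete, the following MATLAB sketch shows how each $\hat{h}$ term in Equation 6 could be computed and combined. It is a minimal illustration under stated assumptions, not the authors' TRENTOOL-based implementation: the function names knn_entropy and di_rate_knn, the max-norm distance, and the neighbor count k are all choices made here for illustration, and knnsearch requires the Statistics and Machine Learning Toolbox.

function hhat = knn_entropy(Z, k)
% Kozachenko-Leonenko differential entropy estimate (in nats) of the rows
% of Z (observations x dimensions), as used by the KSG method (Kraskov et
% al., 2004, Eq. 20), under the max norm, for which the unit-ball volume
% constant c_d equals 1.
    [n, d] = size(Z);
    % Ask for k+1 neighbors: the nearest neighbor of each point is itself.
    [~, dist] = knnsearch(Z, Z, 'K', k + 1, 'Distance', 'chebychev');
    epsk = dist(:, end);              % distance to the k-th true neighbor
    % KSG define epsilon(i) as twice the k-NN distance; duplicate rows in Z
    % would make epsk zero, so a tiny jitter is often added in practice.
    hhat = -psi(k) + psi(n) + d * mean(log(2 * epsk));
end

function dihat = di_rate_knn(x, y, m, k)
% DI-rate estimate from x to y (column vectors, already downsampled so
% that m samples span the 150-ms memory). Under stationarity the per-i
% terms of Equation 6 coincide, so the (m+1)-sample blocks from all time
% indices (and, in the protocol, all trials) are pooled into a single
% estimate of each entropy term.
    n = length(x);
    rows = n - m;
    Xblk = zeros(rows, m + 1);        % blocks X_{i-m}^{i}
    Yblk = zeros(rows, m + 1);        % blocks Y_{i-m}^{i}
    for j = 1:rows
        Xblk(j, :) = x(j : j + m)';
        Yblk(j, :) = y(j : j + m)';
    end
    Ypast = Yblk(:, 1:m);             % blocks Y_{i-m}^{i-1}
    dihat = knn_entropy(Yblk, k) - knn_entropy(Ypast, k) ...
          - knn_entropy([Xblk, Yblk], k) + knn_entropy([Xblk, Ypast], k);
end

For one downsampled channel pair, both directions would be computed, e.g., di_rate_knn(x, y, m, 4) and di_rate_knn(y, x, m, 4), since DI is asymmetric; the neighbor count k = 4 here is an arbitrary illustrative choice.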
