3.1 Large-scale Granger causality analysis

Adora M. DSouza, Anas Z. Abidin, Lutz Leistritz, Axel Wismüller

We use large-scale Granger causality (lsGC) [3] to obtain a measure of directed multivariate information flow between every pair of regional time-series in the resting-state human brain. The principle of lsGC is derived from that of Granger causality [5, 41], which states that if the prediction quality of time-series xs improves in the presence of time-series xr, then xr Granger-causes xs. Large-scale Granger causality comprises estimating future time points, using vector auto-regressive (VAR) modelling of order d, in a dimension-reduced space defined by retaining the first c Principal Components (PCs) of the time-series ensemble. These predictions are transformed back to the original time-series ensemble space using the inverse of the linear transformation applied in the initial rotation of the axes to the PC space. The prediction errors, i.e. residuals, are used to obtain the lsGC coefficients: we compute the variance of the residuals when the full ensemble is used for obtaining the PCs, and compare it with the variance of the residuals obtained when the information of one time-series is removed. We thus obtain a matrix of lsGC coefficients, containing directed multivariate information flow scores for every pair of time-series.

Consider an ensemble of time-series X ∈ ℝ^(N×T), where N and T are the number of time-series and the number of temporal samples, respectively. Let X = (x1, x2, x3, …, xN)^T be the whole multidimensional system, with xn ∈ ℝ^T a single time-series, n ∈ {1, 2, …, N}, and xn = (xn(1), xn(2), …, xn(T)). Large-scale Granger causality differs from cGC by performing Granger causality analysis in a lower-dimensional embedding space. A multivariate GC score is obtained by first decomposing X into its first c high-variance principal components Z ∈ ℝ^(c×T) using Principal Component Analysis (PCA), i.e.,

Z = WX,

where W ∈ ℝ^(c×N) is the PCA transformation matrix.
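The dimension-reduction step can be sketched as follows. This is a minimal illustration (not the authors' implementation), using an SVD of the centered ensemble to obtain the PCA transformation matrix W; the data here are random and purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
N, T, c = 10, 100, 3                     # number of series, samples, retained PCs

X = rng.standard_normal((N, T))          # ensemble of N time-series
X -= X.mean(axis=1, keepdims=True)       # center each series

# PCA via SVD of the centered ensemble: columns of U are PC directions
U, S, Vt = np.linalg.svd(X, full_matrices=False)
W = U[:, :c].T                           # transformation matrix, shape (c, N)
Z = W @ X                                # dimension-reduced ensemble, shape (c, T)

# W has orthonormal rows, so W.T inverts the projection (up to truncation)
X_back = W.T @ Z
```

Because the rows of W are orthonormal, the back-projection needed later is simply W^T, which is what makes the dimension reduction reversible (up to the variance discarded by truncating to c components).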

Subsequently, Z is modelled with a Vector Auto-Regressive (VAR) model of order d, and the estimate Ẑ is computed using the d auto-regression parameter matrices, each of size c × c. To obtain the influence of time-series xr on all other time-series, we remove the information of xr from the transformation matrix W, obtain Z_{X\xr}, and model Ẑ_{X\xr} as its VAR estimate.
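A sketch of this step, under the assumption that the VAR(d) model is fit by ordinary least squares and that removing the information of x_r corresponds to zeroing its column of W (the exact removal scheme in the original method may differ):

```python
import numpy as np

def var_predict(Z, d):
    """Fit a VAR(d) model to Z (c x T) by least squares and return
    one-step-ahead predictions for time points d .. T-1."""
    c, T = Z.shape
    # Stack the d lagged state vectors as regressors, shape (c*d, T-d)
    lags = np.vstack([Z[:, d - k - 1:T - k - 1] for k in range(d)])
    targets = Z[:, d:]                                       # (c, T-d)
    A, *_ = np.linalg.lstsq(lags.T, targets.T, rcond=None)   # (c*d, c)
    return A.T @ lags                                        # predictions (c, T-d)

rng = np.random.default_rng(1)
N, T, c, d = 8, 200, 3, 2
X = rng.standard_normal((N, T))
X -= X.mean(axis=1, keepdims=True)
U, _, _ = np.linalg.svd(X, full_matrices=False)
W = U[:, :c].T

Z_hat = var_predict(W @ X, d)            # estimate with full ensemble

# Remove the information of series r by zeroing its column of W
r = 0
W_minus = W.copy()
W_minus[:, r] = 0.0
Z_hat_minus = var_predict(W_minus @ X, d)
```

Note that both models are fit in the c-dimensional PC space, so each involves only d matrices of size c × c, regardless of N.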

The two estimates Ẑ and Ẑ_{X\xr} are projected back into the original high-dimensional space using the respective inverse PCA transformations. The prediction errors in the original high-dimensional space, E and E_{X\xr}, are then computed, and prediction quality is quantified by comparing the variances of the errors obtained with and without consideration of xr. If the variance of the prediction error of a given time-series, say xs, decreases when xr is used, then xr Granger-causes xs [4].

The GC index for the influence of xr on xs is

F(xr → xs) = ln( var(e_{xs, X\xr}) / var(e_{xs}) ),

which is stored in the affinity matrix A at position A(s, r). Here, e_{xs, X\xr} is the error in predicting xs when xr was not considered, and e_{xs} is the error when xr was used.
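Putting the steps together, an end-to-end sketch of the lsGC affinity matrix might look like the following. This is an assumed variant for illustration, not the authors' reference implementation: it zeroes a column of W to remove a series, uses the pseudo-inverse for back-projection, and applies the log-ratio index above.

```python
import numpy as np

def lsgc(X, c, d):
    """Illustrative large-scale Granger causality sketch.
    X: (N, T) ensemble. Returns an (N, N) affinity matrix F where
    F[s, r] scores the influence of series r on series s."""
    N, T = X.shape
    X = X - X.mean(axis=1, keepdims=True)
    U, _, _ = np.linalg.svd(X, full_matrices=False)
    W = U[:, :c].T                                   # (c, N) PCA projection

    def residual_var(Wm):
        Z = Wm @ X                                   # project to PC space
        lags = np.vstack([Z[:, d - k - 1:T - k - 1] for k in range(d)])
        A_, *_ = np.linalg.lstsq(lags.T, Z[:, d:].T, rcond=None)
        Z_hat = A_.T @ lags                          # VAR(d) predictions
        E = X[:, d:] - np.linalg.pinv(Wm) @ Z_hat    # errors in original space
        return E.var(axis=1)                         # per-series error variance

    v_full = residual_var(W)
    F = np.zeros((N, N))
    for r in range(N):
        W_minus = W.copy()
        W_minus[:, r] = 0.0                          # remove information of x_r
        v_minus = residual_var(W_minus)
        F[:, r] = np.log(v_minus / v_full)           # F[s, r]: x_r -> x_s
    np.fill_diagonal(F, 0.0)
    return F

rng = np.random.default_rng(2)
X = rng.standard_normal((12, 300))
F = lsgc(X, c=4, d=2)
```

For N series, N + 1 reduced models are fit in total (one full, one per removed series), each in the c-dimensional PC space.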

To quantitatively evaluate the performance of our lsGC method, we compare this approach to a conditional Granger causality method (referred to as cGC here) [4] from the literature, which does not require any low-dimensional embedding and obtains GC coefficients in the original ensemble space. For cGC, Z = X, i.e. W is the identity matrix. However, the need for lsGC becomes relevant when N and T are very similar, or when N > T, where N is the size of the ensemble, i.e. the number of time-series, and T is the length of each series. The quantities N and d both increase the number of parameters to be estimated, and their estimation is limited by the length T. The lsGC method alleviates this problem, as it incorporates a reversible dimension-reduction step that decreases the number of free parameters to be estimated for time-series modelling.
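The parameter-count argument can be made concrete with a quick back-of-the-envelope calculation (the values below are illustrative assumptions, not taken from the study):

```python
# VAR(d) coefficient counts: each lag contributes one square parameter matrix
N, T, d, c = 90, 100, 2, 10        # illustrative ensemble size, length, order, PCs

params_cgc = d * N * N             # full-space model: d matrices of size N x N
params_lsgc = d * c * c            # reduced-space model: d matrices of size c x c
samples = N * (T - d)              # scalar regression targets available from X
```

With N close to T, the full-space model's d·N² coefficients quickly approach or exceed the available data, while the reduced-space model needs only d·c² coefficients, which is the motivation for the reversible dimension-reduction step.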
